Feb 23 06:44:36 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 23 06:44:36 crc restorecon[4805]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:44:36 crc restorecon[4805]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc 
restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc 
restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 
06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc 
restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:36 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:37 crc 
restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37
crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc 
restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 
crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc 
restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc 
restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:44:37 crc restorecon[4805]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc 
restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 06:44:37 crc restorecon[4805]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 23 06:44:38 crc kubenswrapper[5047]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 06:44:38 crc kubenswrapper[5047]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 23 06:44:38 crc kubenswrapper[5047]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 06:44:38 crc kubenswrapper[5047]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 23 06:44:38 crc kubenswrapper[5047]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 23 06:44:38 crc kubenswrapper[5047]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.121317 5047 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128017 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128051 5047 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128060 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128071 5047 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128081 5047 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128092 5047 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128102 5047 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128113 5047 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128125 5047 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128149 5047 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128160 5047 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128169 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128178 5047 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128186 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128194 5047 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128202 5047 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128211 5047 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128218 5047 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128226 5047 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128234 5047 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128242 5047 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128250 5047 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128258 5047 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128267 5047 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128276 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128286 5047 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128294 5047 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128303 5047 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128311 5047 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128319 5047 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128327 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128335 5047 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128342 5047 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128353 5047 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128363 5047 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128371 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128380 5047 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128388 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128397 5047 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128405 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128414 5047 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128423 5047 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128431 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128440 5047 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128448 5047 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128469 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128477 5047 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128485 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128493 5047 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128500 5047 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128508 5047 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128516 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128524 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128532 5047 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128539 5047 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128547 5047 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128567 5047 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128575 5047 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128582 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128590 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128597 5047 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128608 5047 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128616 5047 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128624 5047 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128631 5047 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128638 5047 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128648 5047 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128659 5047 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128668 5047 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128677 5047 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.128685 5047 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129501 5047 flags.go:64] FLAG: --address="0.0.0.0"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129525 5047 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129541 5047 flags.go:64] FLAG: --anonymous-auth="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129553 5047 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129564 5047 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129573 5047 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129585 5047 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129596 5047 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129606 5047 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129615 5047 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129625 5047 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129635 5047 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129644 5047 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129652 5047 flags.go:64] FLAG: --cgroup-root=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129661 5047 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129671 5047 flags.go:64] FLAG: --client-ca-file=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129680 5047 flags.go:64] FLAG: --cloud-config=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129689 5047 flags.go:64] FLAG: --cloud-provider=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129699 5047 flags.go:64] FLAG: --cluster-dns="[]"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129709 5047 flags.go:64] FLAG: --cluster-domain=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129718 5047 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129728 5047 flags.go:64] FLAG: --config-dir=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129737 5047 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129747 5047 flags.go:64] FLAG: --container-log-max-files="5"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129757 5047 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129766 5047 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129775 5047 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129785 5047 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129794 5047 flags.go:64] FLAG: --contention-profiling="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129803 5047 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129811 5047 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129821 5047 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129830 5047 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129840 5047 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129851 5047 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129860 5047 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129869 5047 flags.go:64] FLAG: --enable-load-reader="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129879 5047 flags.go:64] FLAG: --enable-server="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129890 5047 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129915 5047 flags.go:64] FLAG: --event-burst="100"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129952 5047 flags.go:64] FLAG: --event-qps="50"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129961 5047 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129970 5047 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129979 5047 flags.go:64] FLAG: --eviction-hard=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.129993 5047 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130003 5047 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130015 5047 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130028 5047 flags.go:64] FLAG: --eviction-soft=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130040 5047 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130052 5047 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130064 5047 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130075 5047 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130083 5047 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130092 5047 flags.go:64] FLAG: --fail-swap-on="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130102 5047 flags.go:64] FLAG: --feature-gates=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130120 5047 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130129 5047 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130138 5047 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130149 5047 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130158 5047 flags.go:64] FLAG: --healthz-port="10248"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130168 5047 flags.go:64] FLAG: --help="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130177 5047 flags.go:64] FLAG: --hostname-override=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130185 5047 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130194 5047 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130204 5047 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130212 5047 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130221 5047 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130230 5047 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130239 5047 flags.go:64] FLAG: --image-service-endpoint=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130248 5047 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130257 5047 flags.go:64] FLAG: --kube-api-burst="100"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130268 5047 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130279 5047 flags.go:64] FLAG: --kube-api-qps="50"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130290 5047 flags.go:64] FLAG: --kube-reserved=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130302 5047 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130314 5047 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130326 5047 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130335 5047 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130344 5047 flags.go:64] FLAG: --lock-file=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130353 5047 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130365 5047 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130374 5047 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130387 5047 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130396 5047 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130404 5047 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130414 5047 flags.go:64] FLAG: --logging-format="text"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130423 5047 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130432 5047 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130442 5047 flags.go:64] FLAG: --manifest-url=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130450 5047 flags.go:64] FLAG: --manifest-url-header=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130462 5047 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130471 5047 flags.go:64] FLAG: --max-open-files="1000000"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130482 5047 flags.go:64] FLAG: --max-pods="110"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130491 5047 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130500 5047 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130508 5047 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130517 5047 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130527 5047 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130537 5047 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130547 5047 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130566 5047 flags.go:64] FLAG: --node-status-max-images="50"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130575 5047 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130584 5047 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130594 5047 flags.go:64] FLAG: --pod-cidr=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130602 5047 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130616 5047 flags.go:64] FLAG: --pod-manifest-path=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130625 5047 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130634 5047 flags.go:64] FLAG: --pods-per-core="0"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130643 5047 flags.go:64] FLAG: --port="10250"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130652 5047 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130660 5047 flags.go:64] FLAG: --provider-id=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130669 5047 flags.go:64] FLAG: --qos-reserved=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130679 5047 flags.go:64] FLAG: --read-only-port="10255"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130688 5047 flags.go:64] FLAG: --register-node="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130696 5047 flags.go:64] FLAG: --register-schedulable="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130705 5047 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130721 5047 flags.go:64] FLAG: --registry-burst="10"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130730 5047 flags.go:64] FLAG: --registry-qps="5"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130739 5047 flags.go:64] FLAG: --reserved-cpus=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130747 5047 flags.go:64] FLAG: --reserved-memory=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130758 5047 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130767 5047 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130776 5047 flags.go:64] FLAG: --rotate-certificates="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130785 5047 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130794 5047 flags.go:64] FLAG: --runonce="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130802 5047 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130812 5047 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130821 5047 flags.go:64] FLAG: --seccomp-default="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130830 5047 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130839 5047 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130848 5047 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130857 5047 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130866 5047 flags.go:64] FLAG: --storage-driver-password="root"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130875 5047 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130883 5047 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130892 5047 flags.go:64] FLAG: --storage-driver-user="root"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130908 5047 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130918 5047 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130953 5047 flags.go:64] FLAG: --system-cgroups=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130962 5047 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130976 5047 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130984 5047 flags.go:64] FLAG: --tls-cert-file=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.130993 5047 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131004 5047 flags.go:64] FLAG: --tls-min-version=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131013 5047 flags.go:64] FLAG: --tls-private-key-file=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131021 5047 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131032 5047 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131041 5047 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131050 5047 flags.go:64] FLAG: --v="2"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131062 5047 flags.go:64] FLAG: --version="false"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131073 5047 flags.go:64] FLAG: --vmodule=""
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131083 5047 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131095 5047 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131299 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131312 5047 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131322 5047 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131331 5047 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131340 5047 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131349 5047 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131357 5047 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131368 5047 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131378 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131386 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131395 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131403 5047 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131411 5047 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131419 5047 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131427 5047 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131435 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131442 5047 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131450 5047 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131458 5047 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131466 5047 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131474 5047 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131482 5047 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131490 5047 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131499 5047 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131510 5047 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131520 5047 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131531 5047 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131539 5047 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131548 5047 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131557 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131565 5047 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131576 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131584 5047 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131593 5047 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131603 5047 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131614 5047 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131622 5047 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131631 5047 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131640 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131648 5047 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131655 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131663 5047 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131671 5047 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131679 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131686 5047 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131694 5047 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131702 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131710 5047 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131718 5047 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131726 5047 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131734 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131742 5047 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131750 5047 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131757 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131765 5047 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131774 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131781 5047 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131789 5047 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131798 5047 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131805 5047 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131813 5047 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131821 5047 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131829 5047 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131837 5047 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131845 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131853 5047 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131861 5047 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131869 5047 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131877 5047 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131884 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.131892 5047 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.131911 5047 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.147833 5047 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.147891 5047 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148054 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148073 5047 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148082 5047 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148092 5047 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148102 5047 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148109 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148116 5047 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148124 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148131 5047 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148137 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148144 5047 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148151 5047 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148158 5047 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148164 5047 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148170 5047 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223
06:44:38.148177 5047 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148186 5047 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148199 5047 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148206 5047 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148214 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148221 5047 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148229 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148237 5047 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148244 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148250 5047 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148257 5047 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148264 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148270 5047 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148277 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 06:44:38 crc kubenswrapper[5047]: 
W0223 06:44:38.148284 5047 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148291 5047 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148298 5047 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148304 5047 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148311 5047 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148320 5047 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148326 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148333 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148340 5047 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148348 5047 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148356 5047 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148364 5047 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148374 5047 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148383 5047 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148391 5047 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148399 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148406 5047 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148414 5047 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148421 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148427 5047 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148434 5047 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148441 5047 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148448 5047 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148454 5047 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148461 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148468 5047 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148474 5047 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148482 5047 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148489 5047 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148496 5047 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148503 5047 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148512 5047 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148521 5047 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148530 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148538 5047 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148545 5047 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148553 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148560 5047 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148567 5047 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148573 5047 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148580 5047 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148588 5047 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.148600 5047 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148813 5047 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148828 5047 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148836 5047 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148844 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148851 5047 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148858 5047 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148868 5047 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148881 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148889 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148896 5047 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148911 5047 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148918 5047 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148962 5047 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148972 5047 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148980 5047 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148988 5047 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.148998 5047 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149006 5047 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149014 5047 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149020 5047 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149027 5047 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149034 5047 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149040 5047 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149047 5047 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149055 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149061 5047 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149069 5047 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149078 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149086 5047 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149094 5047 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149103 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149111 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149118 5047 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149127 5047 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149136 5047 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149143 5047 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149151 5047 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149158 5047 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149164 5047 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149171 5047 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149178 5047 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149185 5047 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149192 5047 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149199 5047 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149205 5047 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149212 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149219 5047 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149226 5047 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149233 5047 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149240 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149247 5047 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149254 5047 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149260 5047 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149267 5047 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149274 5047 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149281 5047 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149288 5047 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149294 5047 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149303 5047 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149311 5047 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149320 5047 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149328 5047 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149335 5047 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149343 5047 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149350 5047 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149359 5047 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149366 5047 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149373 5047 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149381 5047 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149388 5047 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.149397 5047 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.149408 5047 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.150860 5047 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.159618 5047 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.159782 5047 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.161431 5047 server.go:997] "Starting client certificate rotation"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.161472 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.163487 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-27 15:36:12.117266234 +0000 UTC
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.163685 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.187592 5047 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.191507 5047 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.191508 5047 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.204621 5047 log.go:25] "Validated CRI v1 runtime API"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.236738 5047 log.go:25] "Validated CRI v1 image API"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.238330 5047 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.242371 5047 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-23-06-36-05-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.242400 5047 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.257656 5047 manager.go:217] Machine: {Timestamp:2026-02-23 06:44:38.254196873 +0000 UTC m=+0.505524027 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d12f1023-e3c1-471d-b7b7-19c07f350921 BootID:b1b59f5d-eb2f-45ea-8116-a187d6509bf4 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:05:6e:3e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:05:6e:3e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9b:a3:3e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7d:90:cb Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:27:3b:51 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3b:cf:a3 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:b7:43:d5 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:d6:e2:03 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:8d:e9:57:41:48 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1a:7e:af:2e:8a:f0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.257908 5047 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.258092 5047 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.261657 5047 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.262408 5047 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.262455 5047 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.262718 5047 topology_manager.go:138] "Creating topology manager with none policy"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.262736 5047 container_manager_linux.go:303] "Creating device plugin manager"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.263332 5047 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.263379 5047 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.263816 5047 state_mem.go:36] "Initialized new in-memory state store"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.264280 5047 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.267654 5047 kubelet.go:418] "Attempting to sync node with API server"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.267683 5047 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.267705 5047 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.267722 5047 kubelet.go:324] "Adding apiserver pod source"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.267735 5047 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.272302 5047 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.273396 5047 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.274144 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.274222 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.274152 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.274261 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.275164 5047 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 23 06:44:38
crc kubenswrapper[5047]: I0223 06:44:38.276635 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276665 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276675 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276685 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276700 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276710 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276719 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276734 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276744 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276754 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276767 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.276776 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.278267 5047 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.278737 5047 server.go:1280] "Started kubelet" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 
06:44:38.279616 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.279898 5047 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.279895 5047 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 23 06:44:38 crc systemd[1]: Started Kubernetes Kubelet. Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.280537 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.280567 5047 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.280634 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:17:08.791813462 +0000 UTC Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.280860 5047 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.280895 5047 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.280885 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.280984 5047 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.281014 5047 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.281858 5047 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.281950 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.282032 5047 factory.go:55] Registering systemd factory Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.282197 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="200ms" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.282951 5047 factory.go:221] Registration of the systemd container factory successfully Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.284384 5047 server.go:460] "Adding debug handlers to kubelet server" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.284489 5047 factory.go:153] Registering CRI-O factory Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.284511 5047 factory.go:221] Registration of the crio container factory successfully Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.284667 5047 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.284694 5047 factory.go:103] Registering Raw 
factory Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.284713 5047 manager.go:1196] Started watching for new ooms in manager Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.285508 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1896cd28bd82a0e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,LastTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.293297 5047 manager.go:319] Starting recovery of all containers Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.301765 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.302184 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.302402 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.302590 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.302765 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.303024 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.303205 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.303380 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.303546 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.303693 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.303895 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.304124 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.304286 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.304448 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.304674 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" 
seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.304865 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.305140 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.305728 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.305913 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.306159 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.306338 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.306519 5047 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.306668 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.306918 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.307139 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.307302 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.307477 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.307648 5047 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.307811 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.308069 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.308259 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.308458 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.308645 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.308870 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.309112 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.309297 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.309460 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.309653 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.309839 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.310046 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.310252 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.310420 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.310576 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.310705 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313168 5047 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313231 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313253 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313269 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313285 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313300 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313317 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313338 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313378 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313412 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313432 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313454 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313476 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313496 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313518 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313534 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313548 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313563 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313581 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313599 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313619 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313635 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313654 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313672 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313725 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313749 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" 
seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313769 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313799 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313826 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313845 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313866 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313885 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 
06:44:38.313911 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313957 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.313980 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314006 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314027 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314051 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314072 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314092 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314113 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314134 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314153 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314172 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314191 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314211 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314236 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314255 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314274 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314296 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314319 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 
23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314338 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314358 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314380 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314398 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314416 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314431 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314449 5047 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314464 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314478 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314496 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314519 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314538 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314557 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314574 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314590 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314607 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314626 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314643 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314660 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314675 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314692 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314711 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314732 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314752 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314770 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314786 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314803 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314822 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314844 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314589 5047 manager.go:324] Recovery completed Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.314863 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315189 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315225 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315244 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315262 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315278 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315297 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315312 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315329 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315345 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315363 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315385 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315408 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315429 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" 
seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315448 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315464 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315480 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315495 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315511 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315528 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 23 06:44:38 crc 
kubenswrapper[5047]: I0223 06:44:38.315545 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315560 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315576 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315592 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315607 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315625 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315641 5047 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315659 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315675 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315690 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315704 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315720 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315735 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315749 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315766 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315782 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315799 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315815 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315830 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315845 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315859 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315874 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315889 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315909 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315942 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315975 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.315991 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316008 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316023 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316037 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316051 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316067 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316081 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316097 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316110 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316124 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316138 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316151 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316165 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316180 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316194 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316208 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316221 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" 
seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316234 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316250 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316262 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316278 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316294 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316310 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 
06:44:38.316322 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316335 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316350 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316366 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316380 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316394 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316407 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316420 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316439 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316454 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316471 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316486 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316500 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316513 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316526 5047 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316541 5047 reconstruct.go:97] "Volume reconstruction finished" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.316552 5047 reconciler.go:26] "Reconciler: start to sync state" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.330473 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.332243 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.333050 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.333073 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.334421 5047 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.334467 5047 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.334510 5047 
state_mem.go:36] "Initialized new in-memory state store" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.335321 5047 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.339303 5047 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.339370 5047 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.339588 5047 kubelet.go:2335] "Starting kubelet main sync loop" Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.339698 5047 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.345377 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.345639 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.355617 5047 policy_none.go:49] "None policy: Start" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.356482 5047 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.356523 5047 state_mem.go:35] "Initializing new in-memory state store" Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.381975 5047 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.417384 5047 manager.go:334] "Starting Device Plugin manager" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.417784 5047 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.417855 5047 server.go:79] "Starting device plugin registration server" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.418563 5047 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.418597 5047 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.419338 5047 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.419575 5047 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.419600 5047 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.429808 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.440114 5047 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.440243 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.441656 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.441708 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.441720 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.441937 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.442121 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.442192 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.443430 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.443538 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.443560 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.443703 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.444603 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.444733 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445116 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445161 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445173 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445387 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445411 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445422 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445564 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445651 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.445683 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.447093 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.447137 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.447152 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448106 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448151 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448165 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448196 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448249 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448296 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448628 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc 
kubenswrapper[5047]: I0223 06:44:38.448727 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.448785 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.449963 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.450006 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.450022 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.451221 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.452019 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.452046 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.452311 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.452360 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.453333 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.453374 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.453386 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.482972 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="400ms" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519304 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519378 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519403 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519420 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519436 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519454 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519420 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519522 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519654 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519716 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519820 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.519879 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520034 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520148 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520198 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520241 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520553 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520585 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520595 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.520620 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.521812 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621392 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621488 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621522 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621561 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621577 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621597 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621628 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621626 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621666 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621657 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621730 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621744 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621761 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621811 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621848 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621831 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621859 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621839 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.621984 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622035 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622080 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622098 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622113 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622136 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622151 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622159 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622189 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622222 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622245 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.622381 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.722330 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.723576 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.723616 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.723634 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.723659 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.724021 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.764251 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.779313 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.796523 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.803719 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.803987 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2c36dba42330886e905df73a0a31730ae8b260523cd4a810821b018efb5f701b WatchSource:0}: Error finding container 2c36dba42330886e905df73a0a31730ae8b260523cd4a810821b018efb5f701b: Status 404 returned error can't find the container with id 2c36dba42330886e905df73a0a31730ae8b260523cd4a810821b018efb5f701b
Feb 23 06:44:38 crc kubenswrapper[5047]: I0223 06:44:38.807451 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.807806 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-98f725ae65166fa577efdff4994d2953ab4fdfe3aa8fc68dce2231c7ee42ec56 WatchSource:0}: Error finding container 98f725ae65166fa577efdff4994d2953ab4fdfe3aa8fc68dce2231c7ee42ec56: Status 404 returned error can't find the container with id 98f725ae65166fa577efdff4994d2953ab4fdfe3aa8fc68dce2231c7ee42ec56
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.822856 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-57a0bdafba6446a90d9a1f2dd65e93234c556d5d56f5143a84237a35098232fd WatchSource:0}: Error finding container 57a0bdafba6446a90d9a1f2dd65e93234c556d5d56f5143a84237a35098232fd: Status 404 returned error can't find the container with id 57a0bdafba6446a90d9a1f2dd65e93234c556d5d56f5143a84237a35098232fd
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.826112 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f971545a9cfa48188fb30ebb13f3e75235a1ccece3f0667ab161daa95de4aa5b WatchSource:0}: Error finding container f971545a9cfa48188fb30ebb13f3e75235a1ccece3f0667ab161daa95de4aa5b: Status 404 returned error can't find the container with id f971545a9cfa48188fb30ebb13f3e75235a1ccece3f0667ab161daa95de4aa5b
Feb 23 06:44:38 crc kubenswrapper[5047]: W0223 06:44:38.828208 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5431dde8e9152ddad756c5963ed8e3cc396a2a232f358dc51dfb5ba9fa60b491 WatchSource:0}: Error finding container 5431dde8e9152ddad756c5963ed8e3cc396a2a232f358dc51dfb5ba9fa60b491: Status 404 returned error can't find the container with id 5431dde8e9152ddad756c5963ed8e3cc396a2a232f358dc51dfb5ba9fa60b491
Feb 23 06:44:38 crc kubenswrapper[5047]: E0223 06:44:38.884416 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="800ms"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.124726 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.126605 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.126698 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.126725 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.126779 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:44:39 crc kubenswrapper[5047]: E0223 06:44:39.127636 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc"
Feb 23 06:44:39 crc kubenswrapper[5047]: W0223 06:44:39.157995 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:39 crc kubenswrapper[5047]: E0223 06:44:39.158161 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.280849 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.280798 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 05:25:42.058434026 +0000 UTC
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.348348 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.348473 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5431dde8e9152ddad756c5963ed8e3cc396a2a232f358dc51dfb5ba9fa60b491"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.350258 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.350319 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f971545a9cfa48188fb30ebb13f3e75235a1ccece3f0667ab161daa95de4aa5b"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.350431 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.351399 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.351442 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.351459 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.351561 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.351590 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"57a0bdafba6446a90d9a1f2dd65e93234c556d5d56f5143a84237a35098232fd"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.351674 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.352283 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.352301 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.352312 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.353511 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.353537 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2c36dba42330886e905df73a0a31730ae8b260523cd4a810821b018efb5f701b"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.353610 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.354241 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.354261 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.354268 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.355347 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213"}
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.355381 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"98f725ae65166fa577efdff4994d2953ab4fdfe3aa8fc68dce2231c7ee42ec56"}
Feb 23 06:44:39 crc kubenswrapper[5047]: W0223 06:44:39.546344 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:39 crc kubenswrapper[5047]: E0223 06:44:39.546429 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:39 crc kubenswrapper[5047]: W0223 06:44:39.652520 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:39 crc kubenswrapper[5047]: E0223 06:44:39.652596 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:39 crc kubenswrapper[5047]: E0223 06:44:39.685742 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="1.6s"
Feb 23 06:44:39 crc kubenswrapper[5047]: W0223 06:44:39.843630 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:39 crc kubenswrapper[5047]: E0223 06:44:39.843706 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.928538 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.930476 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.930534 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.930551 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:39 crc kubenswrapper[5047]: I0223 06:44:39.930580 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:44:39 crc kubenswrapper[5047]: E0223 06:44:39.931355 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.281028 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 12:22:42.629501083 +0000 UTC
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.281176 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.342312 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 23 06:44:40 crc kubenswrapper[5047]: E0223 06:44:40.343407 5047 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.358756 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541" exitCode=0
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.358827 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541"}
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.358890 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.359739 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.359775 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.359788 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.360401 5047 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8" exitCode=0
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.360471 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8"}
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.360549 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.361238 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.361642 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.361669 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.361680 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.362181 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.362217 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.362231 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.362454 5047 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9" exitCode=0
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.362526 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9"}
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.362628 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.363325 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.363340 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.363348 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.364825 5047 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8" exitCode=0
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.364881 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8"}
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.364988 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.365707 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.365741 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.365787 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.369933 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ea2a8886b868cf981148d459b9d6f92665b1d346bd8d2a07478b2715a8db06f2"}
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.369968 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2394c5130c5db1be3f5acb17466223dbea987efe7cbe5a2fb40033d9db92bc1e"}
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.369980 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d7f6af11b197ea3f94d5b8b0e984f9b2ad265560dcabd474a4c065d9fdb65c05"}
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.369991 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.370605 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.370639 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:40 crc kubenswrapper[5047]: I0223 06:44:40.370653 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:41 crc kubenswrapper[5047]: W0223 06:44:41.037075 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:41 crc kubenswrapper[5047]: E0223 06:44:41.037161 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.281205 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 17:13:23.364735293 +0000 UTC
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.281279 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused
Feb 23 06:44:41 crc kubenswrapper[5047]: E0223 06:44:41.287295 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="3.2s"
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.376129 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3"}
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.376495 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.377441 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.377472 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.377482 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.378002 5047 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16" exitCode=0
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.378065 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16"}
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.378173 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.378803 5047
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.378838 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.378851 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.381311 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.381338 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.381354 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.381429 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.382238 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.382268 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.382280 
5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.384545 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.384548 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6dcfd77743e27766e8acdd98aae6ebd90159ad2f92bb53db71955cc3094741a"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.384666 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.384684 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.384579 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.384698 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.384784 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043"} Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.385480 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.385504 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.385512 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.385780 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.385847 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.385860 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:41 crc kubenswrapper[5047]: W0223 06:44:41.483137 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.129:6443: connect: connection refused Feb 23 06:44:41 crc kubenswrapper[5047]: E0223 06:44:41.483202 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.129:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.531568 5047 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.536697 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.536748 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.536761 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.536823 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:44:41 crc kubenswrapper[5047]: E0223 06:44:41.537265 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.129:6443: connect: connection refused" node="crc" Feb 23 06:44:41 crc kubenswrapper[5047]: I0223 06:44:41.849169 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.096097 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.281325 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 23:04:23.49881657 +0000 UTC Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.389371 5047 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054" exitCode=0 Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.389473 5047 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390027 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054"} Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390103 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390149 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390174 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390509 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390582 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390606 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390744 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390768 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.390776 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.391093 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.391224 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:42 crc kubenswrapper[5047]: I0223 06:44:42.391254 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.281416 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 00:46:38.854763766 +0000 UTC Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.395355 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b"} Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.395390 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.395413 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce"} Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.395429 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.395430 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f"} Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.395605 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:43 crc 
kubenswrapper[5047]: I0223 06:44:43.395635 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19"} Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.395660 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc"} Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.396351 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.396381 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.396393 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.397001 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.397040 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.397051 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.445944 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.446124 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:43 crc 
kubenswrapper[5047]: I0223 06:44:43.447729 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.447775 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.447785 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:43 crc kubenswrapper[5047]: I0223 06:44:43.452040 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.147401 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.281565 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 19:44:07.600020836 +0000 UTC Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.398707 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.398763 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.399052 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.400334 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.400376 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:44 
crc kubenswrapper[5047]: I0223 06:44:44.400394 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.400613 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.400639 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.400648 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.401575 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.737407 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.739941 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.740115 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.740234 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:44 crc kubenswrapper[5047]: I0223 06:44:44.740370 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.281864 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 10:54:14.067401975 +0000 UTC Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.400695 5047 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.400714 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.401607 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.401607 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.401668 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.401681 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.401639 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.401708 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.577039 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.577237 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.578321 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.578352 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:45 crc kubenswrapper[5047]: 
I0223 06:44:45.578362 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:45 crc kubenswrapper[5047]: I0223 06:44:45.718173 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:46 crc kubenswrapper[5047]: I0223 06:44:46.282893 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 04:52:24.975079322 +0000 UTC Feb 23 06:44:46 crc kubenswrapper[5047]: I0223 06:44:46.402923 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:46 crc kubenswrapper[5047]: I0223 06:44:46.404246 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:46 crc kubenswrapper[5047]: I0223 06:44:46.404270 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:46 crc kubenswrapper[5047]: I0223 06:44:46.404277 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:47 crc kubenswrapper[5047]: I0223 06:44:47.283304 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 21:40:41.173820312 +0000 UTC Feb 23 06:44:47 crc kubenswrapper[5047]: I0223 06:44:47.982691 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 23 06:44:47 crc kubenswrapper[5047]: I0223 06:44:47.982989 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:47 crc kubenswrapper[5047]: I0223 06:44:47.984327 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 
06:44:47 crc kubenswrapper[5047]: I0223 06:44:47.984378 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:47 crc kubenswrapper[5047]: I0223 06:44:47.984399 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:48 crc kubenswrapper[5047]: I0223 06:44:48.284069 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:06:28.649286945 +0000 UTC Feb 23 06:44:48 crc kubenswrapper[5047]: E0223 06:44:48.429955 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:44:48 crc kubenswrapper[5047]: I0223 06:44:48.827602 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:48 crc kubenswrapper[5047]: I0223 06:44:48.827885 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:48 crc kubenswrapper[5047]: I0223 06:44:48.829473 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:48 crc kubenswrapper[5047]: I0223 06:44:48.829517 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:48 crc kubenswrapper[5047]: I0223 06:44:48.829533 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:49 crc kubenswrapper[5047]: I0223 06:44:49.232832 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:49 crc kubenswrapper[5047]: I0223 06:44:49.285419 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: 
Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:14:16.368550382 +0000 UTC Feb 23 06:44:49 crc kubenswrapper[5047]: I0223 06:44:49.410437 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:49 crc kubenswrapper[5047]: I0223 06:44:49.411534 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:49 crc kubenswrapper[5047]: I0223 06:44:49.411611 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:49 crc kubenswrapper[5047]: I0223 06:44:49.411663 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:50 crc kubenswrapper[5047]: I0223 06:44:50.002273 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:44:50 crc kubenswrapper[5047]: I0223 06:44:50.286172 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:29:50.612544637 +0000 UTC Feb 23 06:44:50 crc kubenswrapper[5047]: I0223 06:44:50.412772 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:50 crc kubenswrapper[5047]: I0223 06:44:50.413809 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:50 crc kubenswrapper[5047]: I0223 06:44:50.413869 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:50 crc kubenswrapper[5047]: I0223 06:44:50.413888 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:51 crc kubenswrapper[5047]: I0223 06:44:51.287765 5047 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 05:12:58.085389226 +0000 UTC Feb 23 06:44:51 crc kubenswrapper[5047]: I0223 06:44:51.829353 5047 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:44:51 crc kubenswrapper[5047]: I0223 06:44:51.829438 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:44:51 crc kubenswrapper[5047]: W0223 06:44:51.907053 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 23 06:44:51 crc kubenswrapper[5047]: I0223 06:44:51.907150 5047 trace.go:236] Trace[207965445]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 06:44:41.905) (total time: 10001ms): Feb 23 06:44:51 crc kubenswrapper[5047]: Trace[207965445]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:44:51.907) Feb 23 06:44:51 crc kubenswrapper[5047]: Trace[207965445]: [10.001452078s] [10.001452078s] END Feb 23 06:44:51 crc kubenswrapper[5047]: E0223 06:44:51.907178 5047 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 23 06:44:52 crc kubenswrapper[5047]: W0223 06:44:52.011812 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.011943 5047 trace.go:236] Trace[906854014]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 06:44:42.010) (total time: 10000ms): Feb 23 06:44:52 crc kubenswrapper[5047]: Trace[906854014]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (06:44:52.011) Feb 23 06:44:52 crc kubenswrapper[5047]: Trace[906854014]: [10.000941672s] [10.000941672s] END Feb 23 06:44:52 crc kubenswrapper[5047]: E0223 06:44:52.011967 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.097010 5047 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.097089 5047 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.161773 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z Feb 23 06:44:52 crc kubenswrapper[5047]: E0223 06:44:52.165855 5047 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:44:52 crc kubenswrapper[5047]: E0223 06:44:52.166470 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:44:52 crc kubenswrapper[5047]: E0223 06:44:52.168493 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 
2026-02-23T05:33:13Z" interval="6.4s" Feb 23 06:44:52 crc kubenswrapper[5047]: W0223 06:44:52.170358 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z Feb 23 06:44:52 crc kubenswrapper[5047]: E0223 06:44:52.170412 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:44:52 crc kubenswrapper[5047]: W0223 06:44:52.172331 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z Feb 23 06:44:52 crc kubenswrapper[5047]: E0223 06:44:52.172393 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:44:52 crc kubenswrapper[5047]: E0223 06:44:52.172640 5047 event.go:368] "Unable to write event 
(may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896cd28bd82a0e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,LastTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.176674 5047 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.176715 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.283570 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:52Z is 
after 2026-02-23T05:33:13Z Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.288797 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 05:44:05.035610881 +0000 UTC Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.419635 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.421604 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6dcfd77743e27766e8acdd98aae6ebd90159ad2f92bb53db71955cc3094741a" exitCode=255 Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.421657 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d6dcfd77743e27766e8acdd98aae6ebd90159ad2f92bb53db71955cc3094741a"} Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.421793 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.422576 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.422610 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.422620 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:52 crc kubenswrapper[5047]: I0223 06:44:52.423075 5047 scope.go:117] "RemoveContainer" containerID="d6dcfd77743e27766e8acdd98aae6ebd90159ad2f92bb53db71955cc3094741a" Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 
06:44:53.283383 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:53Z is after 2026-02-23T05:33:13Z Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 06:44:53.289715 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 08:59:59.025697609 +0000 UTC Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 06:44:53.426052 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 06:44:53.432252 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc"} Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 06:44:53.432567 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 06:44:53.434608 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 06:44:53.434646 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:53 crc kubenswrapper[5047]: I0223 06:44:53.434655 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.284008 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:54Z is after 2026-02-23T05:33:13Z Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.290443 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 20:03:23.177015347 +0000 UTC Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.440067 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.441211 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.443575 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" exitCode=255 Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.443631 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc"} Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.443692 5047 scope.go:117] "RemoveContainer" containerID="d6dcfd77743e27766e8acdd98aae6ebd90159ad2f92bb53db71955cc3094741a" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.443898 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.445704 5047 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.445797 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.445835 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:54 crc kubenswrapper[5047]: I0223 06:44:54.447345 5047 scope.go:117] "RemoveContainer" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" Feb 23 06:44:54 crc kubenswrapper[5047]: E0223 06:44:54.447745 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.285012 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:55Z is after 2026-02-23T05:33:13Z Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.291567 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 19:20:52.980838661 +0000 UTC Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.450622 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.577599 5047 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.577851 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.579467 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.579528 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.579550 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:55 crc kubenswrapper[5047]: I0223 06:44:55.580431 5047 scope.go:117] "RemoveContainer" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" Feb 23 06:44:55 crc kubenswrapper[5047]: E0223 06:44:55.580770 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:44:56 crc kubenswrapper[5047]: I0223 06:44:56.283731 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:56Z is after 2026-02-23T05:33:13Z Feb 23 06:44:56 crc kubenswrapper[5047]: I0223 06:44:56.292127 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 13:56:52.832496911 +0000 UTC Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.108109 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.108443 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.110449 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.110499 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.110514 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.111161 5047 scope.go:117] "RemoveContainer" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" Feb 23 06:44:57 crc kubenswrapper[5047]: E0223 06:44:57.111341 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.112826 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.284342 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:57Z is after 2026-02-23T05:33:13Z Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.292704 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 02:35:33.313446161 +0000 UTC Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.458651 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.461882 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.461978 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.462006 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:57 crc kubenswrapper[5047]: I0223 06:44:57.463268 5047 scope.go:117] "RemoveContainer" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" Feb 23 06:44:57 crc kubenswrapper[5047]: E0223 06:44:57.463438 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:44:57 crc kubenswrapper[5047]: W0223 06:44:57.685223 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:57Z is after 2026-02-23T05:33:13Z Feb 23 06:44:57 crc kubenswrapper[5047]: E0223 06:44:57.685389 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:44:57 crc kubenswrapper[5047]: W0223 06:44:57.814704 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:57Z is after 2026-02-23T05:33:13Z Feb 23 06:44:57 crc kubenswrapper[5047]: E0223 06:44:57.814813 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.016685 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.016890 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 
06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.018076 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.018120 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.018133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.031684 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.160692 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.286620 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:58Z is after 2026-02-23T05:33:13Z Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.292960 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 07:13:58.199185127 +0000 UTC Feb 23 06:44:58 crc kubenswrapper[5047]: E0223 06:44:58.430235 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.461764 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.461775 5047 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.463772 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.463839 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.463772 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.463938 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.463982 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.464012 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.465178 5047 scope.go:117] "RemoveContainer" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" Feb 23 06:44:58 crc kubenswrapper[5047]: E0223 06:44:58.465562 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.567113 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.568995 5047 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.569040 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.569057 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:44:58 crc kubenswrapper[5047]: I0223 06:44:58.569138 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:44:58 crc kubenswrapper[5047]: E0223 06:44:58.572196 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:58Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:44:58 crc kubenswrapper[5047]: E0223 06:44:58.573420 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:58Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:44:59 crc kubenswrapper[5047]: I0223 06:44:59.286517 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:44:59Z is after 2026-02-23T05:33:13Z Feb 23 06:44:59 crc kubenswrapper[5047]: I0223 06:44:59.293836 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 11:18:28.81620513 +0000 UTC Feb 23 06:45:00 crc kubenswrapper[5047]: 
I0223 06:45:00.285184 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:00Z is after 2026-02-23T05:33:13Z Feb 23 06:45:00 crc kubenswrapper[5047]: I0223 06:45:00.294684 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 19:55:27.96308205 +0000 UTC Feb 23 06:45:00 crc kubenswrapper[5047]: W0223 06:45:00.531694 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:00Z is after 2026-02-23T05:33:13Z Feb 23 06:45:00 crc kubenswrapper[5047]: E0223 06:45:00.531855 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:00 crc kubenswrapper[5047]: I0223 06:45:00.585479 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:45:00 crc kubenswrapper[5047]: E0223 06:45:00.595023 5047 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:01 crc kubenswrapper[5047]: I0223 06:45:01.285702 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:01Z is after 2026-02-23T05:33:13Z Feb 23 06:45:01 crc kubenswrapper[5047]: I0223 06:45:01.295104 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:59:50.262690678 +0000 UTC Feb 23 06:45:01 crc kubenswrapper[5047]: I0223 06:45:01.829084 5047 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:45:01 crc kubenswrapper[5047]: I0223 06:45:01.829227 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:45:01 crc kubenswrapper[5047]: W0223 06:45:01.913643 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:01Z is after 2026-02-23T05:33:13Z Feb 23 06:45:01 crc kubenswrapper[5047]: E0223 06:45:01.913777 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:02 crc kubenswrapper[5047]: E0223 06:45:02.179245 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:02Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896cd28bd82a0e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,LastTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:45:02 crc kubenswrapper[5047]: I0223 06:45:02.286041 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:02Z is after 2026-02-23T05:33:13Z Feb 23 06:45:02 crc kubenswrapper[5047]: I0223 06:45:02.295263 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 11:26:53.239852225 +0000 UTC Feb 23 06:45:03 crc kubenswrapper[5047]: I0223 06:45:03.283451 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:03Z is after 2026-02-23T05:33:13Z Feb 23 06:45:03 crc kubenswrapper[5047]: I0223 06:45:03.295719 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 11:16:20.222214595 +0000 UTC Feb 23 06:45:04 crc kubenswrapper[5047]: I0223 06:45:04.285609 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:04Z is after 2026-02-23T05:33:13Z Feb 23 06:45:04 crc kubenswrapper[5047]: I0223 06:45:04.296067 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 14:06:18.305234428 +0000 UTC Feb 23 06:45:05 crc kubenswrapper[5047]: W0223 06:45:05.032498 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-23T06:45:05Z is after 2026-02-23T05:33:13Z Feb 23 06:45:05 crc kubenswrapper[5047]: E0223 06:45:05.032566 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:05 crc kubenswrapper[5047]: I0223 06:45:05.283048 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:05Z is after 2026-02-23T05:33:13Z Feb 23 06:45:05 crc kubenswrapper[5047]: I0223 06:45:05.296248 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 21:20:56.9841001 +0000 UTC Feb 23 06:45:05 crc kubenswrapper[5047]: I0223 06:45:05.573515 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:05 crc kubenswrapper[5047]: I0223 06:45:05.575099 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:05 crc kubenswrapper[5047]: I0223 06:45:05.575162 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:05 crc kubenswrapper[5047]: I0223 06:45:05.575189 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:05 crc kubenswrapper[5047]: I0223 06:45:05.575232 5047 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Feb 23 06:45:05 crc kubenswrapper[5047]: E0223 06:45:05.575730 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:05Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:45:05 crc kubenswrapper[5047]: E0223 06:45:05.577872 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:05Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:45:06 crc kubenswrapper[5047]: I0223 06:45:06.283374 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:06Z is after 2026-02-23T05:33:13Z Feb 23 06:45:06 crc kubenswrapper[5047]: I0223 06:45:06.296877 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:55:38.525600332 +0000 UTC Feb 23 06:45:07 crc kubenswrapper[5047]: I0223 06:45:07.285551 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:07Z is after 2026-02-23T05:33:13Z Feb 23 06:45:07 crc kubenswrapper[5047]: I0223 06:45:07.297043 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:59:54.40193264 +0000 UTC Feb 23 06:45:08 crc kubenswrapper[5047]: I0223 06:45:08.284904 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:08Z is after 2026-02-23T05:33:13Z Feb 23 06:45:08 crc kubenswrapper[5047]: I0223 06:45:08.297426 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 15:30:12.989007577 +0000 UTC Feb 23 06:45:08 crc kubenswrapper[5047]: E0223 06:45:08.430435 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:08 crc kubenswrapper[5047]: W0223 06:45:08.977102 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:08Z is after 2026-02-23T05:33:13Z Feb 23 06:45:08 crc kubenswrapper[5047]: E0223 06:45:08.977202 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:09 crc kubenswrapper[5047]: I0223 06:45:09.286115 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:09Z is after 2026-02-23T05:33:13Z Feb 23 06:45:09 crc kubenswrapper[5047]: I0223 06:45:09.297531 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 20:28:30.404590686 +0000 UTC Feb 23 06:45:09 crc kubenswrapper[5047]: I0223 06:45:09.340670 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:09 crc kubenswrapper[5047]: I0223 06:45:09.341961 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:09 crc kubenswrapper[5047]: I0223 06:45:09.342010 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:09 crc kubenswrapper[5047]: I0223 06:45:09.342023 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:09 crc kubenswrapper[5047]: I0223 06:45:09.342648 5047 scope.go:117] "RemoveContainer" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.285277 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:10Z is after 2026-02-23T05:33:13Z Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.298677 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 06:39:10.53644178 +0000 UTC Feb 23 
06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.498237 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.499137 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.502061 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26dcb6702bd085cffee97b08a0672550ddc1d98151bdcc237f4796fbcc88a8e8" exitCode=255 Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.502123 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"26dcb6702bd085cffee97b08a0672550ddc1d98151bdcc237f4796fbcc88a8e8"} Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.502182 5047 scope.go:117] "RemoveContainer" containerID="b9ba0cc30c87f44d0d14724da49143006883c943cb6a705088e39b2f933401fc" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.502419 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.503804 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.503880 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.503905 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.505143 5047 scope.go:117] 
"RemoveContainer" containerID="26dcb6702bd085cffee97b08a0672550ddc1d98151bdcc237f4796fbcc88a8e8" Feb 23 06:45:10 crc kubenswrapper[5047]: E0223 06:45:10.505611 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.663938 5047 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:47122->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.664015 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:47122->192.168.126.11:10357: read: connection reset by peer" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.664078 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.664270 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.665684 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.665715 
5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.665725 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.666190 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d7f6af11b197ea3f94d5b8b0e984f9b2ad265560dcabd474a4c065d9fdb65c05"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 23 06:45:10 crc kubenswrapper[5047]: I0223 06:45:10.666363 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d7f6af11b197ea3f94d5b8b0e984f9b2ad265560dcabd474a4c065d9fdb65c05" gracePeriod=30 Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.283943 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:11Z is after 2026-02-23T05:33:13Z Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.299465 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 08:22:30.057405638 +0000 UTC Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.507959 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 
23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.514866 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.515572 5047 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d7f6af11b197ea3f94d5b8b0e984f9b2ad265560dcabd474a4c065d9fdb65c05" exitCode=255 Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.515636 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d7f6af11b197ea3f94d5b8b0e984f9b2ad265560dcabd474a4c065d9fdb65c05"} Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.515713 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae"} Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.515884 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.517370 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.517449 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:11 crc kubenswrapper[5047]: I0223 06:45:11.517475 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:12 crc kubenswrapper[5047]: E0223 06:45:12.183790 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:12Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896cd28bd82a0e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,LastTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:45:12 crc kubenswrapper[5047]: I0223 06:45:12.283314 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:12Z is after 2026-02-23T05:33:13Z Feb 23 06:45:12 crc kubenswrapper[5047]: I0223 06:45:12.300216 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 22:16:31.747999435 +0000 UTC Feb 23 06:45:12 crc kubenswrapper[5047]: I0223 06:45:12.578470 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:12 crc kubenswrapper[5047]: I0223 06:45:12.581320 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:12 crc kubenswrapper[5047]: I0223 06:45:12.581388 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:12 crc 
kubenswrapper[5047]: I0223 06:45:12.581410 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:12 crc kubenswrapper[5047]: I0223 06:45:12.581465 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:45:12 crc kubenswrapper[5047]: E0223 06:45:12.582773 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:12Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:45:12 crc kubenswrapper[5047]: E0223 06:45:12.587340 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:12Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:45:13 crc kubenswrapper[5047]: I0223 06:45:13.286304 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:13Z is after 2026-02-23T05:33:13Z Feb 23 06:45:13 crc kubenswrapper[5047]: I0223 06:45:13.300785 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 03:41:28.501887166 +0000 UTC Feb 23 06:45:14 crc kubenswrapper[5047]: I0223 06:45:14.285726 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:45:14Z is after 2026-02-23T05:33:13Z Feb 23 06:45:14 crc kubenswrapper[5047]: I0223 06:45:14.301253 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 21:36:47.775449988 +0000 UTC Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.285839 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:15Z is after 2026-02-23T05:33:13Z Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.302313 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 18:04:21.231785143 +0000 UTC Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.577133 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.577512 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.579552 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.579635 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.579662 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:15 crc kubenswrapper[5047]: I0223 06:45:15.580856 5047 scope.go:117] "RemoveContainer" 
containerID="26dcb6702bd085cffee97b08a0672550ddc1d98151bdcc237f4796fbcc88a8e8" Feb 23 06:45:15 crc kubenswrapper[5047]: E0223 06:45:15.581269 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:45:16 crc kubenswrapper[5047]: W0223 06:45:16.140053 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:16Z is after 2026-02-23T05:33:13Z Feb 23 06:45:16 crc kubenswrapper[5047]: E0223 06:45:16.140159 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:16 crc kubenswrapper[5047]: I0223 06:45:16.285068 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:16Z is after 2026-02-23T05:33:13Z Feb 23 06:45:16 crc kubenswrapper[5047]: I0223 06:45:16.303012 5047 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 03:40:19.115185904 +0000 UTC Feb 23 06:45:17 crc kubenswrapper[5047]: I0223 06:45:17.284382 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:17Z is after 2026-02-23T05:33:13Z Feb 23 06:45:17 crc kubenswrapper[5047]: I0223 06:45:17.303525 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 09:23:48.085514238 +0000 UTC Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.120145 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:45:18 crc kubenswrapper[5047]: E0223 06:45:18.125991 5047 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:18 crc kubenswrapper[5047]: E0223 06:45:18.127187 5047 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.160602 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:18 crc 
kubenswrapper[5047]: I0223 06:45:18.160786 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.161925 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.161963 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.161974 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.162440 5047 scope.go:117] "RemoveContainer" containerID="26dcb6702bd085cffee97b08a0672550ddc1d98151bdcc237f4796fbcc88a8e8" Feb 23 06:45:18 crc kubenswrapper[5047]: E0223 06:45:18.162581 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.284975 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:18Z is after 2026-02-23T05:33:13Z Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.303640 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:13:14.386651521 +0000 UTC Feb 23 06:45:18 crc kubenswrapper[5047]: E0223 
06:45:18.430635 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.827830 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.828260 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.829858 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.829971 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:18 crc kubenswrapper[5047]: I0223 06:45:18.829999 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:19 crc kubenswrapper[5047]: I0223 06:45:19.283247 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:19Z is after 2026-02-23T05:33:13Z Feb 23 06:45:19 crc kubenswrapper[5047]: I0223 06:45:19.304472 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 19:53:55.931816601 +0000 UTC Feb 23 06:45:19 crc kubenswrapper[5047]: I0223 06:45:19.588103 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:19 crc kubenswrapper[5047]: E0223 06:45:19.590543 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:19Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:45:19 crc kubenswrapper[5047]: I0223 06:45:19.590821 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:19 crc kubenswrapper[5047]: I0223 06:45:19.590889 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:19 crc kubenswrapper[5047]: I0223 06:45:19.590941 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:19 crc kubenswrapper[5047]: I0223 06:45:19.590988 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:45:19 crc kubenswrapper[5047]: E0223 06:45:19.596601 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:19Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:45:20 crc kubenswrapper[5047]: I0223 06:45:20.002012 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:20 crc kubenswrapper[5047]: I0223 06:45:20.002219 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:20 crc kubenswrapper[5047]: I0223 06:45:20.003731 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:20 crc kubenswrapper[5047]: I0223 06:45:20.003802 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 06:45:20 crc kubenswrapper[5047]: I0223 06:45:20.003816 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:20 crc kubenswrapper[5047]: I0223 06:45:20.287462 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:20Z is after 2026-02-23T05:33:13Z Feb 23 06:45:20 crc kubenswrapper[5047]: I0223 06:45:20.305422 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 13:26:18.153949221 +0000 UTC Feb 23 06:45:21 crc kubenswrapper[5047]: I0223 06:45:21.285033 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:21Z is after 2026-02-23T05:33:13Z Feb 23 06:45:21 crc kubenswrapper[5047]: I0223 06:45:21.305820 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 06:12:10.035117494 +0000 UTC Feb 23 06:45:21 crc kubenswrapper[5047]: I0223 06:45:21.827747 5047 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:45:21 crc kubenswrapper[5047]: I0223 06:45:21.827851 5047 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:45:22 crc kubenswrapper[5047]: E0223 06:45:22.190807 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:22Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896cd28bd82a0e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,LastTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:45:22 crc kubenswrapper[5047]: I0223 06:45:22.285423 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:22Z is after 2026-02-23T05:33:13Z Feb 23 06:45:22 crc kubenswrapper[5047]: I0223 06:45:22.306121 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:12:16.707431243 +0000 UTC Feb 23 06:45:23 crc kubenswrapper[5047]: 
I0223 06:45:23.285332 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:23Z is after 2026-02-23T05:33:13Z Feb 23 06:45:23 crc kubenswrapper[5047]: I0223 06:45:23.307020 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 05:01:29.384497546 +0000 UTC Feb 23 06:45:24 crc kubenswrapper[5047]: I0223 06:45:24.284308 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:24Z is after 2026-02-23T05:33:13Z Feb 23 06:45:24 crc kubenswrapper[5047]: I0223 06:45:24.308002 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 16:33:16.127734676 +0000 UTC Feb 23 06:45:25 crc kubenswrapper[5047]: I0223 06:45:25.285653 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:25Z is after 2026-02-23T05:33:13Z Feb 23 06:45:25 crc kubenswrapper[5047]: I0223 06:45:25.308719 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 07:09:21.634211728 +0000 UTC Feb 23 06:45:25 crc kubenswrapper[5047]: W0223 06:45:25.483319 5047 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:25Z is after 2026-02-23T05:33:13Z Feb 23 06:45:25 crc kubenswrapper[5047]: E0223 06:45:25.483400 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:26 crc kubenswrapper[5047]: I0223 06:45:26.287659 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:26Z is after 2026-02-23T05:33:13Z Feb 23 06:45:26 crc kubenswrapper[5047]: I0223 06:45:26.309612 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 13:29:23.340740788 +0000 UTC Feb 23 06:45:26 crc kubenswrapper[5047]: I0223 06:45:26.597174 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:26 crc kubenswrapper[5047]: E0223 06:45:26.597267 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T06:45:26Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:45:26 crc kubenswrapper[5047]: I0223 06:45:26.599238 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:26 crc kubenswrapper[5047]: I0223 06:45:26.599329 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:26 crc kubenswrapper[5047]: I0223 06:45:26.599357 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:26 crc kubenswrapper[5047]: I0223 06:45:26.599409 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:45:26 crc kubenswrapper[5047]: E0223 06:45:26.604837 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:26Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:45:27 crc kubenswrapper[5047]: I0223 06:45:27.284363 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:27Z is after 2026-02-23T05:33:13Z Feb 23 06:45:27 crc kubenswrapper[5047]: I0223 06:45:27.310744 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 05:21:15.859749833 +0000 UTC Feb 23 06:45:28 crc kubenswrapper[5047]: W0223 06:45:28.029180 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:28Z is after 2026-02-23T05:33:13Z Feb 23 06:45:28 crc kubenswrapper[5047]: E0223 06:45:28.029291 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:28 crc kubenswrapper[5047]: I0223 06:45:28.285896 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:28Z is after 2026-02-23T05:33:13Z Feb 23 06:45:28 crc kubenswrapper[5047]: I0223 06:45:28.311332 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 04:34:38.78923889 +0000 UTC Feb 23 06:45:28 crc kubenswrapper[5047]: E0223 06:45:28.430760 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:29 crc kubenswrapper[5047]: I0223 06:45:29.283357 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:29Z is after 2026-02-23T05:33:13Z Feb 23 06:45:29 crc kubenswrapper[5047]: I0223 06:45:29.312502 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 19:35:54.517903575 +0000 UTC Feb 23 06:45:30 crc kubenswrapper[5047]: I0223 06:45:30.284799 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:30Z is after 2026-02-23T05:33:13Z Feb 23 06:45:30 crc kubenswrapper[5047]: I0223 06:45:30.313294 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 02:20:03.240801887 +0000 UTC Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.282657 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:31Z is after 2026-02-23T05:33:13Z Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.313636 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 17:27:22.171554303 +0000 UTC Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.829068 5047 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.829157 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.854862 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.855206 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.856456 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.856510 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:31 crc kubenswrapper[5047]: I0223 06:45:31.856526 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:32 crc kubenswrapper[5047]: E0223 06:45:32.194732 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:32Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896cd28bd82a0e1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,LastTimestamp:2026-02-23 06:44:38.278709473 +0000 UTC m=+0.530036627,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:45:32 crc kubenswrapper[5047]: W0223 06:45:32.267794 5047 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:32Z is after 2026-02-23T05:33:13Z Feb 23 06:45:32 crc kubenswrapper[5047]: E0223 06:45:32.267864 5047 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.282973 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:32Z is after 2026-02-23T05:33:13Z Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.314571 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 14:02:22.622480947 +0000 UTC Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.340242 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.341886 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.341941 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.341950 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.342490 5047 scope.go:117] "RemoveContainer" containerID="26dcb6702bd085cffee97b08a0672550ddc1d98151bdcc237f4796fbcc88a8e8" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.586744 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.590771 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a"} Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.590979 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.595727 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.595825 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:32 crc kubenswrapper[5047]: I0223 06:45:32.595844 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.282635 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:13Z Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.315045 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 19:10:20.27794626 +0000 UTC Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.593628 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.594012 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.595182 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" exitCode=255 Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.595213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a"} Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.595244 5047 scope.go:117] "RemoveContainer" containerID="26dcb6702bd085cffee97b08a0672550ddc1d98151bdcc237f4796fbcc88a8e8" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.595367 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.596104 5047 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.596127 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.596136 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.596509 5047 scope.go:117] "RemoveContainer" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" Feb 23 06:45:33 crc kubenswrapper[5047]: E0223 06:45:33.596639 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:45:33 crc kubenswrapper[5047]: E0223 06:45:33.600739 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.605965 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.607270 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.607325 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:33 crc 
kubenswrapper[5047]: I0223 06:45:33.607342 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:33 crc kubenswrapper[5047]: I0223 06:45:33.607372 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:45:33 crc kubenswrapper[5047]: E0223 06:45:33.610616 5047 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:45:34 crc kubenswrapper[5047]: I0223 06:45:34.285665 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:34Z is after 2026-02-23T05:33:13Z Feb 23 06:45:34 crc kubenswrapper[5047]: I0223 06:45:34.315533 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 13:40:34.613586265 +0000 UTC Feb 23 06:45:34 crc kubenswrapper[5047]: I0223 06:45:34.600388 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.284054 5047 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:35Z is after 2026-02-23T05:33:13Z Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.315743 5047 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 04:11:51.21860835 +0000 UTC Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.577720 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.578091 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.579982 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.580043 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.580065 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:35 crc kubenswrapper[5047]: I0223 06:45:35.581002 5047 scope.go:117] "RemoveContainer" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" Feb 23 06:45:35 crc kubenswrapper[5047]: E0223 06:45:35.581356 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:45:36 crc kubenswrapper[5047]: I0223 06:45:36.195749 5047 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 23 06:45:36 crc kubenswrapper[5047]: I0223 06:45:36.316491 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 
UTC, rotation deadline is 2025-12-15 19:12:08.15519546 +0000 UTC Feb 23 06:45:37 crc kubenswrapper[5047]: I0223 06:45:37.317574 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 22:14:23.577853243 +0000 UTC Feb 23 06:45:38 crc kubenswrapper[5047]: I0223 06:45:38.160581 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:38 crc kubenswrapper[5047]: I0223 06:45:38.160741 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:38 crc kubenswrapper[5047]: I0223 06:45:38.162124 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:38 crc kubenswrapper[5047]: I0223 06:45:38.162161 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:38 crc kubenswrapper[5047]: I0223 06:45:38.162172 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:38 crc kubenswrapper[5047]: I0223 06:45:38.162740 5047 scope.go:117] "RemoveContainer" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" Feb 23 06:45:38 crc kubenswrapper[5047]: E0223 06:45:38.162926 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:45:38 crc kubenswrapper[5047]: I0223 06:45:38.318789 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 
+0000 UTC, rotation deadline is 2025-12-04 16:13:40.617292368 +0000 UTC Feb 23 06:45:38 crc kubenswrapper[5047]: E0223 06:45:38.431654 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:39 crc kubenswrapper[5047]: I0223 06:45:39.318996 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 16:37:30.209535802 +0000 UTC Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.319521 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 06:53:23.194750078 +0000 UTC Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.610784 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.612758 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.612817 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.612836 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.613062 5047 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.628607 5047 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.629119 5047 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.629162 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.633672 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.633726 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.633747 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.633774 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.633795 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:40Z","lastTransitionTime":"2026-02-23T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.654258 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.665748 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.665821 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.665844 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.665881 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.665950 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:40Z","lastTransitionTime":"2026-02-23T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.696504 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.696574 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.696600 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.696635 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.696661 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:40Z","lastTransitionTime":"2026-02-23T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.716695 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.737838 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.737936 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.737958 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.737988 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:40 crc kubenswrapper[5047]: I0223 06:45:40.738008 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:40Z","lastTransitionTime":"2026-02-23T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.752688 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.752859 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.752897 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.853397 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:40 crc kubenswrapper[5047]: E0223 06:45:40.954161 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.055050 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.155849 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.256396 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.320671 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 03:40:24.484095336 +0000 UTC Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.357588 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.394017 5047 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39198->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.394129 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39198->192.168.126.11:10357: read: connection reset by peer" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.394460 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.394730 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.397073 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.397132 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.397153 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.398197 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup 
probe, will be restarted" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.398373 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae" gracePeriod=30 Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.458281 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.558895 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.630709 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.633124 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.633593 5047 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae" exitCode=255 Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.633653 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae"} Feb 23 06:45:41 crc kubenswrapper[5047]: I0223 06:45:41.633719 5047 scope.go:117] "RemoveContainer" 
containerID="d7f6af11b197ea3f94d5b8b0e984f9b2ad265560dcabd474a4c065d9fdb65c05" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.659301 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.759969 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.860691 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:41 crc kubenswrapper[5047]: E0223 06:45:41.961444 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.061616 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.162084 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.262879 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: I0223 06:45:42.321919 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:45:02.070616551 +0000 UTC Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.363359 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.463922 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.564452 5047 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: I0223 06:45:42.640374 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 06:45:42 crc kubenswrapper[5047]: I0223 06:45:42.641886 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"467ed8a21c0c42fdc43da9adf454627fd84c723f7336563bfbcb6c06a048009d"} Feb 23 06:45:42 crc kubenswrapper[5047]: I0223 06:45:42.642137 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:42 crc kubenswrapper[5047]: I0223 06:45:42.644579 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:42 crc kubenswrapper[5047]: I0223 06:45:42.644640 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:42 crc kubenswrapper[5047]: I0223 06:45:42.644663 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.665080 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.765582 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.865696 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:42 crc kubenswrapper[5047]: E0223 06:45:42.966482 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc 
kubenswrapper[5047]: E0223 06:45:43.067361 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.168173 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.269005 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: I0223 06:45:43.322172 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 03:41:30.894523051 +0000 UTC Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.369697 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.470211 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.571375 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: I0223 06:45:43.644531 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:43 crc kubenswrapper[5047]: I0223 06:45:43.645361 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:43 crc kubenswrapper[5047]: I0223 06:45:43.645384 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:43 crc kubenswrapper[5047]: I0223 06:45:43.645414 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.672548 5047 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.773713 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.874463 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:43 crc kubenswrapper[5047]: E0223 06:45:43.975346 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.076410 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.176846 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.277522 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: I0223 06:45:44.323152 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 12:35:01.18031588 +0000 UTC Feb 23 06:45:44 crc kubenswrapper[5047]: I0223 06:45:44.340477 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:44 crc kubenswrapper[5047]: I0223 06:45:44.341954 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:44 crc kubenswrapper[5047]: I0223 06:45:44.342029 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:44 crc kubenswrapper[5047]: I0223 06:45:44.342051 5047 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.378328 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.479158 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.579747 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.680537 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.781705 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.882368 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:44 crc kubenswrapper[5047]: E0223 06:45:44.983385 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.084363 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.185599 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.286194 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: I0223 06:45:45.324014 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:29:09.773673183 +0000 UTC Feb 
23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.387024 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.487183 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.588310 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.688838 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.789836 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.890474 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:45 crc kubenswrapper[5047]: E0223 06:45:45.991232 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.091896 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.193020 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.294154 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: I0223 06:45:46.324702 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 11:45:46.348411644 +0000 UTC Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 
06:45:46.394870 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.495471 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.596186 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.696614 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.797609 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.898488 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:46 crc kubenswrapper[5047]: E0223 06:45:46.999342 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.100201 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.200622 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.301162 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: I0223 06:45:47.324857 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 15:36:12.938973699 +0000 UTC Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.401719 5047 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.502877 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.603645 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.704670 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.805548 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5047]: E0223 06:45:47.906270 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.006985 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.107820 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.207968 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.308400 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: I0223 06:45:48.325072 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 07:10:46.063245048 +0000 UTC Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.408967 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.432179 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.510098 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.610705 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.711657 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.811990 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:48 crc kubenswrapper[5047]: I0223 06:45:48.828354 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:48 crc kubenswrapper[5047]: I0223 06:45:48.828506 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:48 crc kubenswrapper[5047]: I0223 06:45:48.829771 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:48 crc kubenswrapper[5047]: I0223 06:45:48.829803 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:48 crc kubenswrapper[5047]: I0223 06:45:48.829813 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:48 crc kubenswrapper[5047]: I0223 06:45:48.831998 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 
06:45:48 crc kubenswrapper[5047]: E0223 06:45:48.912342 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.013054 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.113678 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.214509 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.315634 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: I0223 06:45:49.325984 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 06:22:04.313316712 +0000 UTC Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.415825 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.516825 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.617230 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: I0223 06:45:49.660453 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:49 crc kubenswrapper[5047]: I0223 06:45:49.660465 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:49 crc 
kubenswrapper[5047]: I0223 06:45:49.661488 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:49 crc kubenswrapper[5047]: I0223 06:45:49.661539 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:49 crc kubenswrapper[5047]: I0223 06:45:49.661553 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.717643 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.818610 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:49 crc kubenswrapper[5047]: E0223 06:45:49.919644 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.020374 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.120999 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.129164 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.144604 5047 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.159058 5047 csr.go:261] certificate signing request csr-7nvfg is approved, waiting to be issued Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.166920 5047 csr.go:257] certificate signing request csr-7nvfg is issued Feb 23 06:45:50 
crc kubenswrapper[5047]: E0223 06:45:50.221704 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.322678 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.326822 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 23:27:57.169544282 +0000 UTC Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.423662 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.525635 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.626055 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.662795 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.663925 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.663967 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.663977 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.726860 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.827335 5047 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.928381 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.975625 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.980085 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.980128 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.980138 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.980157 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.980167 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:50Z","lastTransitionTime":"2026-02-23T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:50 crc kubenswrapper[5047]: E0223 06:45:50.990309 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.997428 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.997462 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.997474 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.997489 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:50 crc kubenswrapper[5047]: I0223 06:45:50.997502 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:50Z","lastTransitionTime":"2026-02-23T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.007402 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.054451 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.054530 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.054548 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.054576 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.054594 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:51Z","lastTransitionTime":"2026-02-23T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.071585 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.080667 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.080722 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.080740 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.080787 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.080807 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:45:51Z","lastTransitionTime":"2026-02-23T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.093510 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.093623 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.093645 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.168022 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-23 06:40:50 +0000 UTC, rotation deadline is 2026-11-18 10:07:26.533869012 +0000 UTC Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.168085 5047 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6435h21m35.365787108s for next certificate rotation Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.194775 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.295925 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: I0223 06:45:51.327655 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 14:01:36.976827352 +0000 UTC Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.397019 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.497969 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: 
E0223 06:45:51.598969 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.699114 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.799494 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:51 crc kubenswrapper[5047]: E0223 06:45:51.899765 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.001485 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.102071 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.202744 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.302844 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: I0223 06:45:52.328462 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:43:04.408572701 +0000 UTC Feb 23 06:45:52 crc kubenswrapper[5047]: I0223 06:45:52.340894 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:52 crc kubenswrapper[5047]: I0223 06:45:52.347437 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:52 crc kubenswrapper[5047]: I0223 06:45:52.347486 5047 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:52 crc kubenswrapper[5047]: I0223 06:45:52.347500 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:52 crc kubenswrapper[5047]: I0223 06:45:52.348379 5047 scope.go:117] "RemoveContainer" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.348597 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.403533 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.504439 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.604891 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.705381 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.806384 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:52 crc kubenswrapper[5047]: E0223 06:45:52.908288 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.009295 5047 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.110001 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.210588 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.311079 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: I0223 06:45:53.329375 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 23:28:55.226740486 +0000 UTC Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.414227 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.515303 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.615916 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.716636 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.817568 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:53 crc kubenswrapper[5047]: E0223 06:45:53.918477 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.019170 5047 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.119970 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.220886 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.322002 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: I0223 06:45:54.330255 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 21:15:19.441225864 +0000 UTC Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.423044 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.523976 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.624569 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.725127 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.826238 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:54 crc kubenswrapper[5047]: E0223 06:45:54.927021 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.027785 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 
06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.128749 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.229186 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.329987 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: I0223 06:45:55.331060 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 03:01:13.108889814 +0000 UTC Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.430135 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.531247 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.632276 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.733395 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.834199 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:55 crc kubenswrapper[5047]: E0223 06:45:55.934980 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.035779 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 
06:45:56.136772 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.237823 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: I0223 06:45:56.332045 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 03:32:24.267865833 +0000 UTC Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.338201 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.439129 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.540217 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.640830 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.741727 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.842775 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:56 crc kubenswrapper[5047]: E0223 06:45:56.943392 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.044130 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.144267 5047 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.244952 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: I0223 06:45:57.332551 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 20:24:24.161775957 +0000 UTC Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.345113 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.445977 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.547086 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.647538 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.748019 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.849047 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:57 crc kubenswrapper[5047]: E0223 06:45:57.949967 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.050960 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.152178 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: I0223 06:45:58.164536 5047 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.252777 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: I0223 06:45:58.333304 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 16:52:14.455004804 +0000 UTC Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.353152 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.433172 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.453824 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.553989 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.654977 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.755953 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.856160 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:58 crc kubenswrapper[5047]: E0223 06:45:58.956929 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.057675 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.157803 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.258847 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: I0223 06:45:59.333959 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 08:28:55.91509669 +0000 UTC Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.359550 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.460238 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.561320 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.661404 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: I0223 06:45:59.747217 5047 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.761806 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 06:45:59.862801 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:59 crc kubenswrapper[5047]: E0223 
06:45:59.963614 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: I0223 06:46:00.007433 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:00 crc kubenswrapper[5047]: I0223 06:46:00.007623 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:46:00 crc kubenswrapper[5047]: I0223 06:46:00.009083 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:00 crc kubenswrapper[5047]: I0223 06:46:00.009120 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:00 crc kubenswrapper[5047]: I0223 06:46:00.009133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.063762 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.164700 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.265231 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: I0223 06:46:00.334650 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:27:05.706235268 +0000 UTC Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.366359 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.467547 
5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.568049 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.668882 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.770031 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.870799 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:00 crc kubenswrapper[5047]: E0223 06:46:00.971522 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.072521 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.173429 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.274758 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.335691 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 05:35:58.222645224 +0000 UTC Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.356137 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.361380 5047 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.361410 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.361419 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.361435 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.361446 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:01Z","lastTransitionTime":"2026-02-23T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.377538 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.382398 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.382449 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.382462 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.382482 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.382498 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:01Z","lastTransitionTime":"2026-02-23T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.392543 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"[status patch payload identical to the previous attempt; elided]\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.397157 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.397237 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.397255 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.397274 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.397316 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:01Z","lastTransitionTime":"2026-02-23T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.406338 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"[status patch payload identical to the previous attempt; elided]\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.411189 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.411236 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.411245 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.411262 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:01 crc kubenswrapper[5047]: I0223 06:46:01.411273 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:01Z","lastTransitionTime":"2026-02-23T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.423261 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.423373 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.423401 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.524640 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.625343 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.726096 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.827074 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:01 crc kubenswrapper[5047]: E0223 06:46:01.928161 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.028989 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.129359 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.230523 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.331457 5047 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: I0223 06:46:02.336686 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 11:17:04.456250226 +0000 UTC Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.431763 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.532889 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.633589 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.733719 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.834957 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:02 crc kubenswrapper[5047]: E0223 06:46:02.935676 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.036681 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.137142 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.238360 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: I0223 06:46:03.337206 5047 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:47:01.548708463 +0000 UTC Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.339406 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.439654 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.540531 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.641677 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.742183 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.843103 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:03 crc kubenswrapper[5047]: E0223 06:46:03.943814 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.044500 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.145058 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.245393 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: I0223 06:46:04.338000 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration 
is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 02:18:36.07794576 +0000 UTC Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.346363 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.447058 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.547747 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.648491 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.749524 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.850424 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:04 crc kubenswrapper[5047]: E0223 06:46:04.951163 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.051998 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.153091 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.253479 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: I0223 06:46:05.339251 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2026-01-11 20:48:05.138303696 +0000 UTC Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.354647 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.455356 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.556386 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.657357 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.758004 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.859104 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:05 crc kubenswrapper[5047]: E0223 06:46:05.960083 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.060461 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.161191 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.261967 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: I0223 06:46:06.340403 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 20:42:26.736429043 +0000 UTC Feb 23 
06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.362578 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.463279 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.563672 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.664524 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.765609 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.866110 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:06 crc kubenswrapper[5047]: E0223 06:46:06.967028 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.067665 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.168322 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.269142 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: I0223 06:46:07.340479 5047 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:46:07 crc kubenswrapper[5047]: I0223 06:46:07.341330 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 11:17:57.598811262 +0000 UTC Feb 23 06:46:07 crc kubenswrapper[5047]: I0223 06:46:07.342492 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:07 crc kubenswrapper[5047]: I0223 06:46:07.342557 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:07 crc kubenswrapper[5047]: I0223 06:46:07.342583 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:07 crc kubenswrapper[5047]: I0223 06:46:07.343799 5047 scope.go:117] "RemoveContainer" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.344263 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.369818 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.470655 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.571684 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.671889 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 
06:46:07.772418 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.873239 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:07 crc kubenswrapper[5047]: E0223 06:46:07.974127 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.075112 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.175858 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.276495 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: I0223 06:46:08.342134 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 00:28:32.856067823 +0000 UTC Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.377340 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.434083 5047 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.477692 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.578317 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.679230 5047 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.779934 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.880424 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:08 crc kubenswrapper[5047]: E0223 06:46:08.981462 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.081881 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.182028 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.283118 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: I0223 06:46:09.343229 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 01:01:23.610491981 +0000 UTC Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.383760 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.484986 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.586129 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.687260 5047 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.787567 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.888222 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:09 crc kubenswrapper[5047]: E0223 06:46:09.989104 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.090028 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: I0223 06:46:10.179887 5047 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.190733 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.291055 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: I0223 06:46:10.343981 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 10:21:36.568294635 +0000 UTC Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.391538 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.491711 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.593027 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 
06:46:10 crc kubenswrapper[5047]: I0223 06:46:10.637326 5047 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.693463 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.794232 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.894626 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:10 crc kubenswrapper[5047]: E0223 06:46:10.995629 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.096379 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.196578 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.297203 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.344609 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 01:58:35.37259412 +0000 UTC Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.398266 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.499104 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.599862 
5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.700210 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.800565 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.814686 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.822977 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.823102 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.823133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.823164 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.823186 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.843030 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.848954 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.849061 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.849091 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.849183 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.849252 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.869259 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.875100 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.875249 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.875275 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.875300 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.875365 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.894766 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.900198 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.900294 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.900354 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.900382 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:11 crc kubenswrapper[5047]: I0223 06:46:11.900463 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.916155 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.916505 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:11 crc kubenswrapper[5047]: E0223 06:46:11.916557 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.017316 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.118258 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.218606 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.319763 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: I0223 06:46:12.345631 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 06:00:46.410824115 +0000 UTC Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.420100 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.521004 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.621645 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.722167 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.822845 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:12 crc kubenswrapper[5047]: E0223 06:46:12.923364 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.024046 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.124843 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.225202 5047 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.296741 5047 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.320652 5047 apiserver.go:52] "Watching apiserver" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.324289 5047 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.324545 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 
06:46:13.324979 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.325027 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.325030 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.325113 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.325192 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.325235 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.325335 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.325396 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.325386 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.327180 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.327191 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.327680 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.327809 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328050 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328077 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328090 5047 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328103 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328109 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328090 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328165 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.328252 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.330272 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.330334 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.345747 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 16:00:39.139224513 +0000 UTC Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.363815 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.376353 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.382371 5047 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.388328 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.397789 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.409318 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.421375 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.431128 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.431177 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.431238 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.431252 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 
06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.431274 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.431288 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437715 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437758 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437794 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437814 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437830 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437862 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437884 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437931 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437950 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " 
Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437967 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.437986 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438020 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438038 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438054 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438085 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438101 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438116 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438135 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438151 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438175 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438196 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438186 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438212 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438250 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438269 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438287 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438320 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438342 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438358 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438377 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438417 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438436 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438454 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438518 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438535 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438567 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438585 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438605 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438621 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438659 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438681 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438697 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438739 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438757 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438775 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438808 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438826 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438841 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438859 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438888 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438952 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438968 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439170 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439210 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439227 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439244 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439261 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439416 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439595 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439611 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439627 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439885 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439937 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439957 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439973 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439992 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440028 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440044 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440063 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440149 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440167 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440189 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440232 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440249 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440269 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440289 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 
06:46:13.440367 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440388 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440403 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440450 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440474 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440493 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440529 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440547 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440564 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440647 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440666 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:46:13 crc 
kubenswrapper[5047]: I0223 06:46:13.440682 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440698 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440778 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440796 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440812 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440852 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" 
(UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440869 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440885 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440933 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440952 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440969 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440986 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441028 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441047 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441064 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441153 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441174 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" 
(UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441189 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441206 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441275 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441292 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441308 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441728 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441748 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438280 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438448 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441807 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.438659 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439029 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439087 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439733 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439755 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439754 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.439876 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440003 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440113 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440173 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440275 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440265 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440379 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440458 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440820 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440881 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440871 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.440920 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441047 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.441815 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442091 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442134 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442174 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442220 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442259 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442303 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442343 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442374 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442408 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442443 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 
06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442473 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442505 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442733 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443024 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442764 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443118 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443151 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443180 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443313 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443339 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443329 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443367 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443393 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443416 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443444 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443466 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443491 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443513 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443539 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443561 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443582 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443605 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443627 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443647 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443674 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443696 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443719 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443741 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443762 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443783 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443808 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443829 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443850 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443871 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443895 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443941 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443966 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.443989 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444012 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444034 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444055 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444075 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444129 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444152 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444177 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444200 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444222 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444245 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444270 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444296 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444319 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444344 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444365 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444390 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444410 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444434 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444456 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444478 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444500 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444521 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444543 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444563 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444586 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444608 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444629 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444650 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444677 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444700 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444724 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444746 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444770 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444793 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444827 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444848 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444870 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444891 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444962 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444992 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445020 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445047 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445080 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445144 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445171 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445196 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445221 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445244 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445275 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445298 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445321 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445370 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445452 5047 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445470 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445487 5047 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445501 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 23 
06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445514 5047 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445527 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445540 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445552 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445566 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445578 5047 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445591 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445604 5047 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445619 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445633 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445646 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445659 5047 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445672 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445685 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445698 5047 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445715 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445732 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445746 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445759 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445772 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445785 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445797 5047 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node 
\"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442753 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445835 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.442845 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444304 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444353 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444409 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444596 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.444674 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445652 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445959 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.445996 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447037 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447116 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.447204 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:13.947171093 +0000 UTC m=+96.198498277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447231 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447299 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447323 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447120 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447650 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447950 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.447984 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.448099 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.448126 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.448522 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449006 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449276 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449603 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449635 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449802 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449723 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449894 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449982 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.449946 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450452 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450572 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450591 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450599 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450602 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450676 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450698 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450802 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.450812 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451101 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451152 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451161 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451373 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451237 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451257 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451554 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.451786 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.452259 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.452360 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.452429 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.452433 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.453064 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.453063 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.453171 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.453164 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.453606 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.453677 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.455327 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.455355 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.455204 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.455443 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.455481 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.455534 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.455766 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.456173 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.456246 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.456525 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.456926 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.456953 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.457056 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.457077 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.457384 5047 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.457462 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-23 06:46:13.957440051 +0000 UTC m=+96.208767255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.457485 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.457690 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.457783 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:13.9577553 +0000 UTC m=+96.209082434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.457969 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.458019 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.458304 5047 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.458870 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.458867 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.458728 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.459006 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.459040 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.459152 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.459482 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.459719 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.460866 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.461510 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.461829 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.462181 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.463167 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.463807 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.464228 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.466149 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.466235 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.468057 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.458130 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.470294 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.470515 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.470806 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.471199 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.471259 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.471726 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.470389 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.472207 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.472232 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.472252 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.472267 5047 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.472334 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:13.972310413 +0000 UTC m=+96.223637547 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.472460 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.472504 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.472593 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.472601 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.472613 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.472630 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.472949 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.472976 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.473057 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.473116 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:13.972665673 +0000 UTC m=+96.223992897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.473953 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.475174 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.475217 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.475307 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.475534 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.475806 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.476283 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.476387 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.477781 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.478204 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.478590 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.479764 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.479801 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.479920 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.480046 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.480255 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.483251 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.483441 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.483649 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.483734 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.483949 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.483959 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.483989 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.484893 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.485031 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.485356 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.485722 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.486157 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.486325 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.486570 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.486610 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.486929 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.487100 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.487334 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.487301 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.486949 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.487407 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.487472 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.487851 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488117 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488300 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488153 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488450 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488471 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488550 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488620 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488631 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488735 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.488636 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.489128 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.489231 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.489316 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.489503 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.489847 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.490039 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.491666 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.492128 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.492178 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.493404 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.493356 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.493835 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.494309 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.495857 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.497047 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.508868 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.511977 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.516987 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.520765 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.534609 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.534655 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.534667 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.534688 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.534702 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547015 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547060 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547532 5047 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547559 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547570 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547579 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547589 5047 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547597 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547607 5047 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547615 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547622 5047 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547630 5047 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on 
node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547638 5047 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547646 5047 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547654 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547541 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547662 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547699 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547713 5047 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547726 5047 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547743 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547762 5047 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547781 5047 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547798 5047 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547814 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547830 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547846 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547859 5047 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547870 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547882 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547901 5047 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547944 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547961 5047 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath 
\"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547977 5047 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.547992 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548005 5047 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548016 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548027 5047 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548039 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548050 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548063 5047 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548074 5047 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548086 5047 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548098 5047 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548110 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548121 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548132 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548144 5047 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548155 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548167 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548178 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548189 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548201 5047 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548212 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548223 5047 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548235 5047 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548246 5047 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548258 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548269 5047 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548280 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548291 5047 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548302 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath 
\"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548314 5047 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548325 5047 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548336 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548347 5047 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548358 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548369 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548380 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548391 5047 reconciler_common.go:293] "Volume detached for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548402 5047 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548416 5047 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548428 5047 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548442 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548454 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548466 5047 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548478 5047 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" 
DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548491 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548503 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548517 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548529 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548541 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548553 5047 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548565 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc 
kubenswrapper[5047]: I0223 06:46:13.548576 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548587 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548599 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548611 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548624 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548635 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548646 5047 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548688 5047 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548700 5047 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548714 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548725 5047 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548736 5047 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548750 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548765 5047 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548783 5047 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548795 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548806 5047 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548817 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548829 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548840 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548852 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548864 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548876 5047 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548887 5047 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548921 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548939 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548955 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548968 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548979 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.548990 5047 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549002 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549013 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549024 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549035 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549046 5047 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549058 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549069 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549081 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549093 5047 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549104 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549116 5047 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549127 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549138 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549150 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549161 5047 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549173 5047 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549184 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549194 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549205 5047 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549216 5047 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549228 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 
crc kubenswrapper[5047]: I0223 06:46:13.549238 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549249 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549260 5047 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549271 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549281 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549292 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549303 5047 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549314 5047 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549325 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549336 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549347 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549359 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549369 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549380 5047 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549391 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549402 5047 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549412 5047 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549424 5047 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549435 5047 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549446 5047 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549458 5047 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549470 5047 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549481 5047 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549492 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549503 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549513 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549526 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549537 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549548 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549559 5047 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549570 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549581 5047 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549591 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549602 5047 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549612 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.549624 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:13 crc 
kubenswrapper[5047]: I0223 06:46:13.637357 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.637404 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.637416 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.637438 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.637452 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.644488 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.649461 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.656441 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:13 crc kubenswrapper[5047]: W0223 06:46:13.671136 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-816d38bf5ac2cc943575cbd08b512b93f2df80bc1243f2d0cdbac71dd661b326 WatchSource:0}: Error finding container 816d38bf5ac2cc943575cbd08b512b93f2df80bc1243f2d0cdbac71dd661b326: Status 404 returned error can't find the container with id 816d38bf5ac2cc943575cbd08b512b93f2df80bc1243f2d0cdbac71dd661b326 Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.733018 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"816d38bf5ac2cc943575cbd08b512b93f2df80bc1243f2d0cdbac71dd661b326"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.733800 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"32c82d59eabbe64a7bf97d1c1d776777f3668d3eebbf54d22751b5cc8d3cdd10"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.735303 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"00f5af6d1952f27e0c167612e8cfa515588be4897b96bf882ec72f8c9e888476"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.740418 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.740443 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:13 crc 
kubenswrapper[5047]: I0223 06:46:13.740453 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.740465 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.740476 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.843994 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.844029 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.844038 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.844054 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.844065 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.946992 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.947035 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.947044 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.947059 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.947071 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:13 crc kubenswrapper[5047]: I0223 06:46:13.953336 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:13 crc kubenswrapper[5047]: E0223 06:46:13.953475 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-23 06:46:14.953445949 +0000 UTC m=+97.204773103 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.049597 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.049652 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.049737 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.049759 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.049771 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.054104 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.054147 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.054192 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.054225 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054348 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054368 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054379 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054379 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054392 5047 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054422 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:15.054405972 +0000 UTC m=+97.305733106 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054441 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:15.054428603 +0000 UTC m=+97.305755737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054459 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:15.054450993 +0000 UTC m=+97.305778127 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054587 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054630 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054644 5047 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.054714 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:15.05469223 +0000 UTC m=+97.306019364 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.153026 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.153069 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.153079 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.153097 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.153110 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.256278 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.256338 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.256350 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.256371 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.256386 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.345104 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.346317 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 09:00:37.008579125 +0000 UTC Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.347979 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.349857 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.351953 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.352754 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.353311 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.353932 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.354505 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.355206 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.355729 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.356222 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.356954 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.357417 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.357939 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.358446 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.358981 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.359462 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.359598 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.359864 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.360055 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.360110 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.360155 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.360174 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.360608 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.362366 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.363680 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.365003 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.367014 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.369784 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.372081 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.373443 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.376149 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.377535 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.380311 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.381670 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.382973 5047 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.384248 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.386897 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.388325 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.388994 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.392404 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.395079 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.396411 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.399392 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.400883 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.402791 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.404209 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.406050 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.406745 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.407296 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.407877 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.408573 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.410551 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.411127 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.411684 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.412707 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.413281 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.414293 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.414801 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.462701 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.462755 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.462764 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.462782 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.462793 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.565391 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.565429 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.565438 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.565453 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.565464 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.668850 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.669205 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.669219 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.669237 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.669249 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.738866 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.738934 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.740839 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.754974 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.767106 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.771170 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.771360 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.771451 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.771548 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.771644 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.782046 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.797089 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.811758 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.825656 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.838217 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.855335 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.871392 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.874608 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.874644 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.874652 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.874668 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.874678 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.885472 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.903818 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.920538 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:14Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.961846 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:14 crc kubenswrapper[5047]: E0223 06:46:14.962105 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:16.962069478 +0000 UTC m=+99.213396612 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.979152 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.979190 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.979202 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.979220 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5047]: I0223 06:46:14.979231 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.063139 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.063476 5047 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.063673 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.063617788 +0000 UTC m=+99.314945042 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.063664 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.064020 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.064076 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064020 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064179 5047 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064227 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064249 5047 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064192 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.064177334 +0000 UTC m=+99.315504498 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064196 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064394 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064397 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.064345008 +0000 UTC m=+99.315672192 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064418 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.064549 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.064531484 +0000 UTC m=+99.315858658 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.083101 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.083125 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.083135 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.083149 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.083160 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.187088 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.187590 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.187767 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.188013 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.188168 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.291991 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.292403 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.292599 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.292986 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.293410 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.340720 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.340731 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.341018 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.341148 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.340746 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:15 crc kubenswrapper[5047]: E0223 06:46:15.341282 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.346929 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 08:21:37.124256379 +0000 UTC Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.398680 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.398817 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.398972 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.399070 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.399103 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.502135 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.502188 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.502202 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.502226 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.502239 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.605134 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.605206 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.605224 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.605252 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.605271 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.708366 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.708409 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.708417 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.708433 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.708442 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.810551 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.810594 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.810603 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.810616 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.810626 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.912371 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.912415 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.912425 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.912442 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5047]: I0223 06:46:15.912451 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.014435 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.014476 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.014505 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.014520 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.014541 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.063673 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d5mx4"] Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.064081 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.065598 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.065844 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.066213 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.081175 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.092274 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.117619 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc 
kubenswrapper[5047]: I0223 06:46:16.117658 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.117667 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.117684 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.117694 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.119822 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.143449 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.161461 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.171024 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.172372 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d-hosts-file\") pod \"node-resolver-d5mx4\" (UID: 
\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\") " pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.172423 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7nvs\" (UniqueName: \"kubernetes.io/projected/5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d-kube-api-access-s7nvs\") pod \"node-resolver-d5mx4\" (UID: \"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\") " pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.184210 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.220679 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.220725 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.220734 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.220758 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.220768 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.273355 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d-hosts-file\") pod \"node-resolver-d5mx4\" (UID: \"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\") " pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.273416 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7nvs\" (UniqueName: \"kubernetes.io/projected/5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d-kube-api-access-s7nvs\") pod \"node-resolver-d5mx4\" (UID: \"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\") " pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.273513 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d-hosts-file\") pod \"node-resolver-d5mx4\" (UID: \"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\") " pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.290212 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7nvs\" (UniqueName: \"kubernetes.io/projected/5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d-kube-api-access-s7nvs\") pod \"node-resolver-d5mx4\" (UID: \"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\") " pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.323698 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.323738 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.323746 5047 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.323765 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.323774 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.347978 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 15:09:24.649674424 +0000 UTC Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.376506 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d5mx4" Feb 23 06:46:16 crc kubenswrapper[5047]: W0223 06:46:16.389254 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6146e4_f9c9_4d32_9de3_e26a05eb6c6d.slice/crio-70dde49987641221bf45b8d10ad9ddbc09b6d17dd7d632f3e2b68f5a666d8494 WatchSource:0}: Error finding container 70dde49987641221bf45b8d10ad9ddbc09b6d17dd7d632f3e2b68f5a666d8494: Status 404 returned error can't find the container with id 70dde49987641221bf45b8d10ad9ddbc09b6d17dd7d632f3e2b68f5a666d8494 Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.426428 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.426476 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.426490 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.426511 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.426528 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.431230 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-n5dz9"] Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.431454 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wh6hv"] Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.431619 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9gf5k"] Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.431651 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.431835 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.432985 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.433572 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.434378 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.434970 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435128 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435186 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435589 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435649 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435722 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435872 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435456 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.435485 5047 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.436437 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.449189 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/servin
g-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.459966 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.474082 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.487166 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.501818 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.512887 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.525295 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.528958 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.528991 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 
06:46:16.529002 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.529020 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.529034 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.534779 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.547327 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.559197 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.572801 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577524 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df53e315-5672-4e94-96bd-fd4f705103c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577577 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9z88\" (UniqueName: \"kubernetes.io/projected/ca275411-978b-439b-ab4b-f98a7ac42f8b-kube-api-access-p9z88\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc 
kubenswrapper[5047]: I0223 06:46:16.577600 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-conf-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577643 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4z6\" (UniqueName: \"kubernetes.io/projected/df53e315-5672-4e94-96bd-fd4f705103c2-kube-api-access-tl4z6\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577676 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca275411-978b-439b-ab4b-f98a7ac42f8b-proxy-tls\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577770 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-os-release\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577814 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " 
pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577861 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-cni-binary-copy\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577917 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-k8s-cni-cncf-io\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577952 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-multus-certs\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.577993 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-etc-kubernetes\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578088 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsvvf\" (UniqueName: \"kubernetes.io/projected/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-kube-api-access-lsvvf\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " 
pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578125 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-system-cni-dir\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578150 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-cnibin\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578180 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca275411-978b-439b-ab4b-f98a7ac42f8b-rootfs\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578203 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-daemon-config\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578255 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-cni-bin\") pod \"multus-n5dz9\" 
(UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578278 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df53e315-5672-4e94-96bd-fd4f705103c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578332 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-cnibin\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578360 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-cni-multus\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578389 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca275411-978b-439b-ab4b-f98a7ac42f8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578414 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-system-cni-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578440 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-hostroot\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578489 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-socket-dir-parent\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578531 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-netns\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578564 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-kubelet\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578585 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-cni-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.578603 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-os-release\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.582634 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.598639 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.612306 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.625233 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.633261 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.633293 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.633304 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.633321 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.633335 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.640568 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.652237 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.663657 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679225 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-netns\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679264 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-kubelet\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679282 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-cni-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679297 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-os-release\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 
06:46:16.679316 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9z88\" (UniqueName: \"kubernetes.io/projected/ca275411-978b-439b-ab4b-f98a7ac42f8b-kube-api-access-p9z88\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679330 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-conf-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679351 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df53e315-5672-4e94-96bd-fd4f705103c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679366 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4z6\" (UniqueName: \"kubernetes.io/projected/df53e315-5672-4e94-96bd-fd4f705103c2-kube-api-access-tl4z6\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679390 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca275411-978b-439b-ab4b-f98a7ac42f8b-proxy-tls\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc 
kubenswrapper[5047]: I0223 06:46:16.679405 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-os-release\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679423 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679438 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-k8s-cni-cncf-io\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679453 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-multus-certs\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679562 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-etc-kubernetes\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679581 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-cni-binary-copy\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679587 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-os-release\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679647 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-system-cni-dir\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679606 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-system-cni-dir\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679709 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-cnibin\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679741 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/ca275411-978b-439b-ab4b-f98a7ac42f8b-rootfs\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679763 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsvvf\" (UniqueName: \"kubernetes.io/projected/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-kube-api-access-lsvvf\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679781 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-daemon-config\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679816 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-cni-bin\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679833 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df53e315-5672-4e94-96bd-fd4f705103c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679870 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-cnibin\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679885 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-cni-multus\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679922 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca275411-978b-439b-ab4b-f98a7ac42f8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679940 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-system-cni-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679957 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-hostroot\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679976 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-conf-dir\") pod \"multus-n5dz9\" (UID: 
\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.679997 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-socket-dir-parent\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680070 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-netns\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680098 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-kubelet\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680336 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-cnibin\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680378 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca275411-978b-439b-ab4b-f98a7ac42f8b-rootfs\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc 
kubenswrapper[5047]: I0223 06:46:16.680417 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-cni-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680468 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-cni-multus\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680635 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-k8s-cni-cncf-io\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680833 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-cnibin\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.680942 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-hostroot\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681001 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-socket-dir-parent\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681061 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-system-cni-dir\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681078 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-var-lib-cni-bin\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681187 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-os-release\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681211 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-etc-kubernetes\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681231 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-host-run-multus-certs\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 
23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681330 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca275411-978b-439b-ab4b-f98a7ac42f8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681663 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df53e315-5672-4e94-96bd-fd4f705103c2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.681923 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-cni-binary-copy\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.682304 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/df53e315-5672-4e94-96bd-fd4f705103c2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.682394 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df53e315-5672-4e94-96bd-fd4f705103c2-cni-binary-copy\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 
06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.682635 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-multus-daemon-config\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.684549 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca275411-978b-439b-ab4b-f98a7ac42f8b-proxy-tls\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.697269 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4z6\" (UniqueName: \"kubernetes.io/projected/df53e315-5672-4e94-96bd-fd4f705103c2-kube-api-access-tl4z6\") pod \"multus-additional-cni-plugins-9gf5k\" (UID: \"df53e315-5672-4e94-96bd-fd4f705103c2\") " pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.697287 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsvvf\" (UniqueName: \"kubernetes.io/projected/e0fbd5e6-7dcc-4a13-936e-0db2e66394e8-kube-api-access-lsvvf\") pod \"multus-n5dz9\" (UID: \"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\") " pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.697370 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9z88\" (UniqueName: \"kubernetes.io/projected/ca275411-978b-439b-ab4b-f98a7ac42f8b-kube-api-access-p9z88\") pod \"machine-config-daemon-wh6hv\" (UID: \"ca275411-978b-439b-ab4b-f98a7ac42f8b\") " pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: 
I0223 06:46:16.736292 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.736343 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.736353 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.736371 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.736384 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.746447 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d5mx4" event={"ID":"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d","Type":"ContainerStarted","Data":"69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.746482 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d5mx4" event={"ID":"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d","Type":"ContainerStarted","Data":"70dde49987641221bf45b8d10ad9ddbc09b6d17dd7d632f3e2b68f5a666d8494"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.750613 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n5dz9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.758363 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.758644 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:46:16 crc kubenswrapper[5047]: W0223 06:46:16.760738 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fbd5e6_7dcc_4a13_936e_0db2e66394e8.slice/crio-b78b1ba3e6f4ce5e5af16aea48935e894568ba3f61737d4c815f61efd02482af WatchSource:0}: Error finding container b78b1ba3e6f4ce5e5af16aea48935e894568ba3f61737d4c815f61efd02482af: Status 404 returned error can't find the container with id b78b1ba3e6f4ce5e5af16aea48935e894568ba3f61737d4c815f61efd02482af Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.764870 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.768839 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.780108 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: W0223 06:46:16.785078 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf53e315_5672_4e94_96bd_fd4f705103c2.slice/crio-7f60030f9126b48c55c798795d1e955383d9f96aa438956690329d19273fd4b4 WatchSource:0}: Error finding container 7f60030f9126b48c55c798795d1e955383d9f96aa438956690329d19273fd4b4: Status 404 returned error can't find the container with id 7f60030f9126b48c55c798795d1e955383d9f96aa438956690329d19273fd4b4 Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.794433 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.795381 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rklm9"] Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.796815 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.799694 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.799734 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.799854 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.799878 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.800066 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.800202 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.800337 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.808752 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.822397 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.834536 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.839783 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.839854 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.839867 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.839884 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.839935 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.844366 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.856063 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\
",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.869304 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881061 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-bin\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881123 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qs44\" (UniqueName: \"kubernetes.io/projected/d56904fe-1a5a-4fde-b122-947fd9a28130-kube-api-access-7qs44\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881143 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc 
kubenswrapper[5047]: I0223 06:46:16.881162 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-env-overrides\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881218 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-node-log\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881237 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d56904fe-1a5a-4fde-b122-947fd9a28130-ovn-node-metrics-cert\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881289 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881314 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-kubelet\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881362 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-var-lib-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881377 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-log-socket\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881390 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-config\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881442 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-netd\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881459 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-script-lib\") pod \"ovnkube-node-rklm9\" (UID: 
\"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881539 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-systemd-units\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881581 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-netns\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881599 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-ovn\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881618 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-systemd\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881633 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-etc-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: 
\"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881647 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-slash\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.881679 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-ovn-kubernetes\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.887038 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.899040 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.912426 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.924331 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.934200 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.942178 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.942220 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.942231 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.942249 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.942259 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.947703 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.966304 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.978807 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982201 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982332 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-kubelet\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982361 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-var-lib-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982386 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-log-socket\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982412 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-netd\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982427 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-config\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982452 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-script-lib\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982492 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-systemd-units\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982505 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-kubelet\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982519 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-netns\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982558 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-netns\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982594 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-ovn\") pod \"ovnkube-node-rklm9\" 
(UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982609 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-var-lib-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982627 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-systemd\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982635 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-log-socket\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982653 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-netd\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982673 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-ovn\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc 
kubenswrapper[5047]: I0223 06:46:16.982702 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-systemd-units\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: E0223 06:46:16.982700 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:20.982667752 +0000 UTC m=+103.233994966 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982723 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-systemd\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982764 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-etc-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 
06:46:16.982944 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-slash\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.982968 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-etc-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983046 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-ovn-kubernetes\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983055 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-slash\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983091 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-ovn-kubernetes\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983123 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-7qs44\" (UniqueName: \"kubernetes.io/projected/d56904fe-1a5a-4fde-b122-947fd9a28130-kube-api-access-7qs44\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983156 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-bin\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983175 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-env-overrides\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983194 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983212 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-node-log\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983239 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d56904fe-1a5a-4fde-b122-947fd9a28130-ovn-node-metrics-cert\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983262 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983318 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983320 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-bin\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983633 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-config\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983698 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-node-log\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983725 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-openvswitch\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983794 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-script-lib\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.983844 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-env-overrides\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.990195 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:16 crc kubenswrapper[5047]: I0223 06:46:16.992746 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d56904fe-1a5a-4fde-b122-947fd9a28130-ovn-node-metrics-cert\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.001655 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qs44\" (UniqueName: \"kubernetes.io/projected/d56904fe-1a5a-4fde-b122-947fd9a28130-kube-api-access-7qs44\") pod \"ovnkube-node-rklm9\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") " pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.002873 5047 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.013926 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.045711 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.045760 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.045769 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.045785 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.045808 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.084654 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.084708 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.084732 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.084752 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.084883 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.084899 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.084930 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.084985 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:21.084966733 +0000 UTC m=+103.336293867 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.084991 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.085105 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.085142 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.085141 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:21.085114158 +0000 UTC m=+103.336441282 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.085154 5047 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.085217 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:21.085211811 +0000 UTC m=+103.336538945 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.085103 5047 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.085288 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:21.085282403 +0000 UTC m=+103.336609537 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.130934 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:17 crc kubenswrapper[5047]: W0223 06:46:17.142186 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56904fe_1a5a_4fde_b122_947fd9a28130.slice/crio-ea324bcdbb4441d059365008950e26825b4335a7f697e202136bcde5032be187 WatchSource:0}: Error finding container ea324bcdbb4441d059365008950e26825b4335a7f697e202136bcde5032be187: Status 404 returned error can't find the container with id ea324bcdbb4441d059365008950e26825b4335a7f697e202136bcde5032be187 Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.149134 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.149172 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.149183 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.149200 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.149210 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.252082 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.252130 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.252143 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.252160 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.252169 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.340257 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.340287 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.340350 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.340533 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.340648 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:17 crc kubenswrapper[5047]: E0223 06:46:17.340735 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.348598 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 06:03:33.582072062 +0000 UTC Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.354534 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.354568 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.354577 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.354592 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.354604 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.457274 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.457310 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.457322 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.457336 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.457348 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.559686 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.559744 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.559756 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.559770 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.559780 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.662459 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.662505 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.662513 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.662530 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.662540 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.752049 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.754862 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f" exitCode=0 Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.755099 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.755167 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"ea324bcdbb4441d059365008950e26825b4335a7f697e202136bcde5032be187"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.758139 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerStarted","Data":"e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.758188 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerStarted","Data":"b78b1ba3e6f4ce5e5af16aea48935e894568ba3f61737d4c815f61efd02482af"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.762728 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="df53e315-5672-4e94-96bd-fd4f705103c2" containerID="8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a" exitCode=0 Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.762786 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerDied","Data":"8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.762805 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerStarted","Data":"7f60030f9126b48c55c798795d1e955383d9f96aa438956690329d19273fd4b4"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.766323 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.766759 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.766691 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.766770 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.766788 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee"} Feb 23 06:46:17 
crc kubenswrapper[5047]: I0223 06:46:17.766800 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.766819 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.766802 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"33744bb5437c6d425b3239935e97fc6596bfcc47e6b37369510b2be75e2a18a3"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.768196 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.791805 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.805830 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.817896 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.828259 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.836752 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.849681 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.860264 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.869706 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.869739 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.869750 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.869770 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.869781 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.874706 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.887459 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.903277 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.919533 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.932383 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.945638 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.959273 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.969920 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.973213 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5047]: 
I0223 06:46:17.973236 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.973267 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.973283 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.973292 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.987445 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:17 crc kubenswrapper[5047]: I0223 06:46:17.998973 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.011124 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.020846 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.031028 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.044926 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.075262 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.075297 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.075305 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.075323 5047 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.075334 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.178045 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.178477 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.178488 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.178506 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.178518 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.291075 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.291132 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.291144 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.291161 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.291171 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.349198 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:26:55.582077941 +0000 UTC Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.354072 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.354634 5047 scope.go:117] "RemoveContainer" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.364548 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.383887 5047 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.395388 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.395422 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.395430 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.395448 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.395460 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.417374 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.432243 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.445362 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.463415 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.483422 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.499098 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.499355 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.499444 5047 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.499535 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.499638 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.502844 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.519938 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.535502 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.553417 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.603671 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.603720 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.603764 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.603787 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.603802 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.707044 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.707099 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.707109 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.707131 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.707142 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.772395 5047 generic.go:334] "Generic (PLEG): container finished" podID="df53e315-5672-4e94-96bd-fd4f705103c2" containerID="686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9" exitCode=0 Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.772495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerDied","Data":"686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.777157 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.780375 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.780549 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.786316 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.786361 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087"} 
Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.786382 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.786400 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.786413 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.786428 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.790159 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.805751 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.809845 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.809888 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.809919 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.809940 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.809955 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.818082 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.840272 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.856210 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.871499 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.887851 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.902791 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.914707 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.915006 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.915036 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.915049 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.915066 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.915081 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.931153 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.945743 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.961310 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.974202 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.985736 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:18 crc kubenswrapper[5047]: I0223 06:46:18.996038 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.015058 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.017563 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc 
kubenswrapper[5047]: I0223 06:46:19.017635 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.017652 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.017671 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.017684 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.033689 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.047797 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, 
/tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.060864 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.079033 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.092621 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.107740 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.120504 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc 
kubenswrapper[5047]: I0223 06:46:19.120562 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.120580 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.120610 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.120629 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.128471 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.144983 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.223323 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.223368 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.223382 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.223402 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.223416 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.326551 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.326597 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.326607 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.326626 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.326639 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.340770 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:19 crc kubenswrapper[5047]: E0223 06:46:19.340886 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.341251 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:19 crc kubenswrapper[5047]: E0223 06:46:19.341310 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.341355 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:19 crc kubenswrapper[5047]: E0223 06:46:19.341394 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.351215 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 05:18:21.40774962 +0000 UTC Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.430326 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.430381 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.430397 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.430420 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.430441 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.532796 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.532842 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.532854 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.532873 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.532887 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.635107 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.635159 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.635172 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.635191 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.635204 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.738759 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.738810 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.738822 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.738845 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.738857 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.792346 5047 generic.go:334] "Generic (PLEG): container finished" podID="df53e315-5672-4e94-96bd-fd4f705103c2" containerID="5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d" exitCode=0 Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.792432 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerDied","Data":"5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.818935 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.838225 5047 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.842588 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.842661 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.842685 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.842718 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.842744 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.860524 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.877126 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"}
,{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.895525 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.911397 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.928009 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.941649 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.945833 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.945861 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.945873 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.945890 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.945923 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.959829 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 
06:46:19.981281 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:19 crc kubenswrapper[5047]: I0223 06:46:19.999697 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.014733 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.048576 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.048638 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.048654 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 
06:46:20.048672 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.048686 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.152005 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.152080 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.152100 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.152131 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.152153 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.256040 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.256097 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.256111 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.256133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.256147 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.352165 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 17:15:10.367836959 +0000 UTC Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.359786 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.359859 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.359876 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.359934 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.359953 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.365806 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.462797 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.462855 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.462869 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.462892 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.462930 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.567058 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.567153 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.567173 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.567212 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.567235 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.670887 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.670956 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.670965 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.670981 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.670992 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.774581 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.774645 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.774660 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.774687 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.774702 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.802287 5047 generic.go:334] "Generic (PLEG): container finished" podID="df53e315-5672-4e94-96bd-fd4f705103c2" containerID="2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2" exitCode=0 Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.802429 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerDied","Data":"2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.871017 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.876704 5047 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.876740 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.876749 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.876767 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.876779 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.891535 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.909952 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.923852 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3f
d14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.953985 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.970011 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.980071 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.980118 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.980133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.980151 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.980161 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5047]: I0223 06:46:20.986131 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:20Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.003314 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.015615 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.039491 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5
b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.048779 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.048934 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:29.048911848 +0000 UTC m=+111.300238972 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.063527 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.080513 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.083765 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.083792 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.083802 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.083818 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.083828 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.097721 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.149267 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.149313 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.149335 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.149357 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149503 5047 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149543 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149557 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149592 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149608 5047 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149603 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149639 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:29.149608193 +0000 UTC m=+111.400935537 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149676 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:29.149656845 +0000 UTC m=+111.400983999 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149565 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149696 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149717 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:29.149694116 +0000 UTC m=+111.401021260 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.149744 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:29.149733617 +0000 UTC m=+111.401060761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.186099 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.186160 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.186179 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.186204 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.186224 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.290407 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.290463 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.290480 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.290508 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.290527 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.340535 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.340656 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.340575 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.340807 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.340975 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:21 crc kubenswrapper[5047]: E0223 06:46:21.341131 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.352336 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:04:08.101208349 +0000 UTC Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.392986 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.393059 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.393084 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.393111 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.393132 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.496363 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.496465 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.496500 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.496537 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.496559 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.600670 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.600721 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.600733 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.600752 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.600764 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.704261 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.704329 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.704347 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.704471 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.704497 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.813449 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.813537 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.813562 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.813594 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.813618 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.823172 5047 generic.go:334] "Generic (PLEG): container finished" podID="df53e315-5672-4e94-96bd-fd4f705103c2" containerID="10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928" exitCode=0 Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.823315 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerDied","Data":"10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.845830 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.856798 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.879335 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3f
d14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.917834 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.918294 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.918885 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.919330 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.919698 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.917611 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.945650 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.968717 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:21 crc kubenswrapper[5047]: I0223 06:46:21.991510 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:21Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.008471 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.024144 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.024199 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.024211 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.024230 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.024243 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.029603 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.057884 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.079613 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, 
/tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.096507 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.116761 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.131795 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.131848 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.131918 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.131941 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.131953 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.137225 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.234749 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.234805 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.234816 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.234832 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.234841 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.278047 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.278677 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.278694 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.278719 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.278736 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: E0223 06:46:22.297493 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.307051 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.307113 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.307131 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.307161 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.307181 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: E0223 06:46:22.321488 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.328200 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.328260 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.328285 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.328309 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.328328 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: E0223 06:46:22.345163 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.350402 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.350452 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.350464 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.350482 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.350496 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.352477 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 05:02:21.303755471 +0000 UTC Feb 23 06:46:22 crc kubenswrapper[5047]: E0223 06:46:22.366149 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",
\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.371091 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.371133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.371145 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.371166 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.371180 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: E0223 06:46:22.383063 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: E0223 06:46:22.383187 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.384816 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.384853 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.384864 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.384880 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.384892 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.487682 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.487721 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.487732 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.487749 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.487761 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.595120 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.595168 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.595179 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.595196 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.595211 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.697436 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.697489 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.697503 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.697524 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.697535 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.800851 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.800941 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.800962 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.800989 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.801008 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.806558 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bhfwp"] Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.807327 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.810533 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.811221 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.811577 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.812157 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.834978 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.846675 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.853311 5047 generic.go:334] "Generic (PLEG): container finished" podID="df53e315-5672-4e94-96bd-fd4f705103c2" containerID="4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea" exitCode=0 Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.853364 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" 
event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerDied","Data":"4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.862142 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.871481 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-host\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.871545 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-serviceca\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.871649 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pdn6r\" (UniqueName: \"kubernetes.io/projected/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-kube-api-access-pdn6r\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.878288 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.889067 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.903763 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.904862 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.904916 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.904931 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.904949 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.904962 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.934000 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.952221 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.965739 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.972788 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdn6r\" (UniqueName: \"kubernetes.io/projected/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-kube-api-access-pdn6r\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.972900 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-host\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.972947 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-serviceca\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.973220 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-host\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.974173 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-serviceca\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.980616 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:22Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:22 crc kubenswrapper[5047]: I0223 06:46:22.996272 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdn6r\" (UniqueName: \"kubernetes.io/projected/5ca4c2cb-5b99-4efc-a494-2e1bc2894054-kube-api-access-pdn6r\") pod \"node-ca-bhfwp\" (UID: \"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\") " pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.003521 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.007697 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.007751 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.007763 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.007787 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.007800 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.020075 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.033323 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06
:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.044458 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-p
roxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.057006 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d
6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.070978 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.091675 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.111200 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.111276 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.111295 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.111321 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.111342 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.111463 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.126268 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bhfwp" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.126961 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.141759 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: W0223 06:46:23.157389 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca4c2cb_5b99_4efc_a494_2e1bc2894054.slice/crio-525528c56d9124b1c04a7e8f16c6ff7312753e07e7745dded77161da42bf56bd WatchSource:0}: Error finding container 525528c56d9124b1c04a7e8f16c6ff7312753e07e7745dded77161da42bf56bd: Status 404 returned error can't find the container with id 525528c56d9124b1c04a7e8f16c6ff7312753e07e7745dded77161da42bf56bd Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.158328 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.173482 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.188575 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.211367 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.215598 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.215660 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.215671 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.215691 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.215720 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.225703 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.252998 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.273569 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.285730 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.319357 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.319441 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.319464 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 
06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.319499 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.319523 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.340386 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.340440 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:23 crc kubenswrapper[5047]: E0223 06:46:23.340517 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.340574 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:23 crc kubenswrapper[5047]: E0223 06:46:23.340613 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:23 crc kubenswrapper[5047]: E0223 06:46:23.340763 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.353416 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 17:55:11.766559508 +0000 UTC Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.422247 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.422317 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.422343 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.422410 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc 
kubenswrapper[5047]: I0223 06:46:23.422437 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.525669 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.526218 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.526233 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.526253 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.526266 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.629621 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.629681 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.629695 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.629716 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.629731 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.732706 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.732756 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.732765 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.732782 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.732793 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.836261 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.836318 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.836331 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.836353 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.836366 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.862760 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bhfwp" event={"ID":"5ca4c2cb-5b99-4efc-a494-2e1bc2894054","Type":"ContainerStarted","Data":"5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.862831 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bhfwp" event={"ID":"5ca4c2cb-5b99-4efc-a494-2e1bc2894054","Type":"ContainerStarted","Data":"525528c56d9124b1c04a7e8f16c6ff7312753e07e7745dded77161da42bf56bd"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.870677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.870978 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.882965 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.890177 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" event={"ID":"df53e315-5672-4e94-96bd-fd4f705103c2","Type":"ContainerStarted","Data":"2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.901308 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.909772 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.924450 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.936593 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.939192 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.939243 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.939257 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.939275 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.939289 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.950509 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.965091 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.981932 5047 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:23 crc kubenswrapper[5047]: I0223 06:46:23.996090 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:23Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.008147 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.020066 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.031010 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.042244 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.042367 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.042388 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.042415 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.042437 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.048401 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.070273 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.087698 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.102111 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.137080 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.146283 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.146358 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.146369 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.146392 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.146407 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.187665 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.206121 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.222261 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.233562 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.250140 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: 
I0223 06:46:24.250183 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.250194 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.250212 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.250223 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.255188 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.269687 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.288585 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.301469 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.312507 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.330425 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.352158 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.352533 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.352635 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.352728 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.352830 5047 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.354664 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 02:28:56.54436648 +0000 UTC Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.360971 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.378250 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.456262 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.456352 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.456395 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.456436 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.456455 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.559726 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.559791 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.559808 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.559835 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.559854 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.663058 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.663124 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.663143 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.663169 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.663187 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.766770 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.766842 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.766956 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.766993 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.767022 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.870816 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.870893 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.870977 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.871020 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.871044 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.896068 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.896140 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.933212 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.955077 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.974133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.974227 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.974248 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.974280 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.974302 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5047]: I0223 06:46:24.979314 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.004734 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.022726 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.037237 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.061458 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.076893 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.077016 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.077034 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.077063 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.077081 5047 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.102069 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.119953 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.142235 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.158166 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.179603 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.180866 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.180954 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.180974 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.181002 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.181023 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.206957 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.222616 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06
:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.239335 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-p
roxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T06:46:25Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.284155 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.284224 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.284238 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.284260 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.284271 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.340129 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.340204 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.340232 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:25 crc kubenswrapper[5047]: E0223 06:46:25.340352 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:25 crc kubenswrapper[5047]: E0223 06:46:25.340498 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:25 crc kubenswrapper[5047]: E0223 06:46:25.340717 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.355946 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:12:33.273124023 +0000 UTC Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.389383 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.389455 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.389474 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.389506 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.389528 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.493782 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.493866 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.493888 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.493959 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.493980 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.596790 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.596838 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.596851 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.596869 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.596881 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.700071 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.700137 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.700151 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.700173 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.700189 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.803877 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.803947 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.803958 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.803979 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.804028 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.908253 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.908308 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.908320 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.908476 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:25 crc kubenswrapper[5047]: I0223 06:46:25.908503 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.011727 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.011806 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.011826 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.011853 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.011870 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.114339 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.114377 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.114390 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.114410 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.114422 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.217534 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.217587 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.217598 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.217617 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.217628 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.320396 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.320455 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.320468 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.320497 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.320514 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.356128 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 22:25:52.860018059 +0000 UTC Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.423000 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.423067 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.423079 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.423095 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.423104 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.526034 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.526142 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.526161 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.526189 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.526206 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.628708 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.628747 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.628758 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.628774 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.628784 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.732046 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.732115 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.732128 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.732158 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.732173 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.835492 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.835557 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.835576 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.835602 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.835627 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.904968 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/0.log" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.909627 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714" exitCode=1 Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.909677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.911224 5047 scope.go:117] "RemoveContainer" containerID="ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.933614 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.938712 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.938749 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.938764 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.938783 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.938797 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.952941 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.970668 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:26 crc kubenswrapper[5047]: I0223 06:46:26.995442 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.016185 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.035106 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.043281 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: 
I0223 06:46:27.043375 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.043402 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.043459 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.043481 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.057181 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.085275 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.104404 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.126484 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.141944 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.146691 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.146734 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.146746 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.146767 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.146782 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.167280 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.201157 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:26Z\\\",\\\"message\\\":\\\"7\\\\nI0223 06:46:26.339697 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:26.339703 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:26.340193 6746 reflector.go:311] 
Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.339669 6746 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:46:26.340259 6746 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 06:46:26.340316 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 06:46:26.340347 6746 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.340390 6746 factory.go:656] Stopping watch factory\\\\nI0223 06:46:26.340403 6746 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:26.340406 6746 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0223 06:46:26.340432 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:46:26.340439 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:26.340500 6746 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a5169104
4ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.225365 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, 
/tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.249514 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.249562 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.249577 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.249598 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.249611 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.339992 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.339992 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:27 crc kubenswrapper[5047]: E0223 06:46:27.340282 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.340041 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:27 crc kubenswrapper[5047]: E0223 06:46:27.340356 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:27 crc kubenswrapper[5047]: E0223 06:46:27.340491 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.352273 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.352322 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.352340 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.352360 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.352423 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.356958 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 10:39:07.309837926 +0000 UTC Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.454730 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.454767 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.454779 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.454792 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.454802 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.557750 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.557798 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.557811 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.557827 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.557840 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.661339 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.661403 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.661422 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.661449 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.661468 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.764502 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.764563 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.764575 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.764597 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.764612 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.867486 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.867542 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.867552 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.867576 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.867590 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.917724 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/0.log" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.921779 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.922465 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.938735 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3
fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.958763 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.972245 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.972299 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.972316 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.972335 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.972348 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:27Z","lastTransitionTime":"2026-02-23T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:27 crc kubenswrapper[5047]: I0223 06:46:27.975944 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.004709 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.037690 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.053803 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.068421 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3f
d14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.074389 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.074429 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.074441 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.074458 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.074470 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.086120 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.104542 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.121118 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.177624 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.177943 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.178067 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.178188 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.178279 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.188884 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.204240 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.227511 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.253103 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:26Z\\\",\\\"message\\\":\\\"7\\\\nI0223 06:46:26.339697 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:26.339703 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:26.340193 6746 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.339669 6746 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:46:26.340259 6746 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 06:46:26.340316 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 06:46:26.340347 6746 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.340390 6746 factory.go:656] Stopping watch factory\\\\nI0223 06:46:26.340403 6746 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:26.340406 6746 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0223 06:46:26.340432 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:46:26.340439 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:26.340500 6746 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.281810 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.282145 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.282336 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.282505 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.282702 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.358005 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 12:35:54.377609433 +0000 UTC Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.358933 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.375635 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.386686 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.386741 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.386753 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.386772 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.386791 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.412734 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.428845 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.445824 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.480742 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.489290 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.489349 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.489362 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.489383 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.489399 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.498083 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.512653 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.526644 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.537484 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.556530 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.579864 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:26Z\\\",\\\"message\\\":\\\"7\\\\nI0223 06:46:26.339697 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:26.339703 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:26.340193 6746 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.339669 6746 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:46:26.340259 6746 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 06:46:26.340316 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 06:46:26.340347 6746 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.340390 6746 factory.go:656] Stopping watch factory\\\\nI0223 06:46:26.340403 6746 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:26.340406 6746 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0223 06:46:26.340432 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:46:26.340439 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:26.340500 6746 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.592963 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.593201 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.593298 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.593481 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.593564 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.595539 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.607403 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.696830 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.697130 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.697196 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc 
kubenswrapper[5047]: I0223 06:46:28.697269 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.697338 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.800937 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.801003 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.801071 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.801097 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.801117 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.904639 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.904730 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.904759 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.904797 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.904821 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:28Z","lastTransitionTime":"2026-02-23T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.928558 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/1.log" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.929497 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/0.log" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.933365 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c" exitCode=1 Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.933454 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c"} Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.933524 5047 scope.go:117] "RemoveContainer" containerID="ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.935665 5047 scope.go:117] "RemoveContainer" containerID="677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c" Feb 23 06:46:28 crc kubenswrapper[5047]: E0223 06:46:28.935879 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.945270 5047 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d"] Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.946039 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.949037 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.949078 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.949730 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.967147 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.985016 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:28 crc kubenswrapper[5047]: I0223 06:46:28.999240 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.008472 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.008523 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.008539 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc 
kubenswrapper[5047]: I0223 06:46:29.008562 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.008574 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.026115 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.040689 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.046484 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.046528 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.046561 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtvdl\" (UniqueName: \"kubernetes.io/projected/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-kube-api-access-jtvdl\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.046579 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.067647 5047 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.081769 5047 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.101155 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T
06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.112156 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.112244 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.112275 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.112313 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.112340 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.125233 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.147708 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.148112 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:45.148065154 +0000 UTC m=+127.399392328 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.148255 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.148311 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.148356 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtvdl\" (UniqueName: \"kubernetes.io/projected/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-kube-api-access-jtvdl\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.148395 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.149446 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.149535 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.156718 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.163467 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:26Z\\\",\\\"message\\\":\\\"7\\\\nI0223 06:46:26.339697 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:26.339703 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:26.340193 6746 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.339669 6746 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:46:26.340259 6746 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 06:46:26.340316 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 06:46:26.340347 6746 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.340390 6746 factory.go:656] Stopping watch factory\\\\nI0223 06:46:26.340403 6746 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:26.340406 6746 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0223 06:46:26.340432 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:46:26.340439 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:26.340500 6746 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.180622 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.186622 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtvdl\" (UniqueName: \"kubernetes.io/projected/6a8a99af-4edb-4cf0-8c50-3ae9a6e38181-kube-api-access-jtvdl\") pod \"ovnkube-control-plane-749d76644c-vz92d\" (UID: \"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.201388 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.214606 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.215801 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.215843 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.215852 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.215873 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.215883 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.237198 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.249486 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.249542 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.249577 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.249604 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249701 5047 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249762 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249786 5047 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249801 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249831 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249812 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249848 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249850 5047 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249877 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-23 06:46:45.24982383 +0000 UTC m=+127.501151134 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249953 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:45.249896332 +0000 UTC m=+127.501223626 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.249982 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:45.249971214 +0000 UTC m=+127.501298358 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.250006 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:45.249994095 +0000 UTC m=+127.501321459 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.253769 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.278441 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.283522 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.292273 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: W0223 06:46:29.304147 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8a99af_4edb_4cf0_8c50_3ae9a6e38181.slice/crio-22cd76e2bdf641d5cda7fccd376fc55c720ecfec39d1def06985b149c8b92725 WatchSource:0}: Error finding container 22cd76e2bdf641d5cda7fccd376fc55c720ecfec39d1def06985b149c8b92725: Status 404 returned error can't find the container with id 22cd76e2bdf641d5cda7fccd376fc55c720ecfec39d1def06985b149c8b92725 Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.312968 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.320527 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.320667 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.320765 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.320850 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.320999 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.330895 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.340627 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.340649 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.340709 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.340890 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.341512 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.341612 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.349138 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.358526 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 03:35:50.011149076 +0000 UTC Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.366115 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.385238 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:26Z\\\",\\\"message\\\":\\\"7\\\\nI0223 06:46:26.339697 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:26.339703 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:26.340193 6746 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.339669 6746 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:46:26.340259 6746 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 06:46:26.340316 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 06:46:26.340347 6746 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.340390 6746 factory.go:656] Stopping watch factory\\\\nI0223 06:46:26.340403 6746 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:26.340406 6746 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0223 06:46:26.340432 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:46:26.340439 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:26.340500 6746 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.398890 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.414031 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.423772 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.423840 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.423861 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 
06:46:29.423896 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.423952 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.427808 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.446260 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.469388 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.492596 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.526339 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.526389 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.526401 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.526419 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.526431 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.628667 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.628718 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.628729 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.628746 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.628760 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.685622 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-54jbp"] Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.686842 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.686988 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.701370 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.718388 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.730843 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc 
kubenswrapper[5047]: I0223 06:46:29.731818 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.731855 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.731866 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.731882 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.731892 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.753675 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.755034 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.755072 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvfcd\" (UniqueName: 
\"kubernetes.io/projected/cb811549-5811-4996-ba8c-6f8848a80ce7-kube-api-access-zvfcd\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.777441 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.811368 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.829202 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.834808 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.834844 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.834855 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.834871 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.834883 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.847749 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.856712 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.856797 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvfcd\" (UniqueName: \"kubernetes.io/projected/cb811549-5811-4996-ba8c-6f8848a80ce7-kube-api-access-zvfcd\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.856938 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.857038 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:30.357003157 +0000 UTC m=+112.608330331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.861750 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc 
kubenswrapper[5047]: I0223 06:46:29.874689 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.879424 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvfcd\" (UniqueName: \"kubernetes.io/projected/cb811549-5811-4996-ba8c-6f8848a80ce7-kube-api-access-zvfcd\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.896027 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.919008 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce16cd9cfe55939d35d6cf0ca055b26f7a8aafa2e8bd66214865f7feb08c0714\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:26Z\\\",\\\"message\\\":\\\"7\\\\nI0223 06:46:26.339697 6746 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:26.339703 6746 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:26.340193 6746 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.339669 6746 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:46:26.340259 6746 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 06:46:26.340316 6746 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 06:46:26.340347 6746 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:26.340390 6746 factory.go:656] Stopping watch factory\\\\nI0223 06:46:26.340403 6746 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:26.340406 6746 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0223 06:46:26.340432 6746 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:46:26.340439 6746 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:26.340500 6746 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.933712 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.944594 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.944646 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.944659 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.944679 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.944691 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:29Z","lastTransitionTime":"2026-02-23T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.950858 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" event={"ID":"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181","Type":"ContainerStarted","Data":"2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.950895 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" event={"ID":"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181","Type":"ContainerStarted","Data":"7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.950926 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" event={"ID":"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181","Type":"ContainerStarted","Data":"22cd76e2bdf641d5cda7fccd376fc55c720ecfec39d1def06985b149c8b92725"} Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.953044 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/1.log" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.955572 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.957024 5047 scope.go:117] "RemoveContainer" containerID="677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c" Feb 23 06:46:29 crc kubenswrapper[5047]: E0223 06:46:29.957182 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.972216 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.984247 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:29 crc kubenswrapper[5047]: I0223 06:46:29.995715 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.010665 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.024770 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc 
kubenswrapper[5047]: I0223 06:46:30.042334 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.047855 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.047930 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.047949 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.047974 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 
06:46:30.047991 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.063986 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.103974 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.120821 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.140785 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.151259 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc 
kubenswrapper[5047]: I0223 06:46:30.151306 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.151323 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.151346 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.151364 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.160208 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.186189 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.216933 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 
06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.240783 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, 
/tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.254976 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.255025 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.255041 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.255063 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.255083 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.261037 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.281762 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.297923 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.314465 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.361736 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.362321 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.362523 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.362673 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.362799 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.364477 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:30 crc kubenswrapper[5047]: E0223 06:46:30.364847 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:30 crc kubenswrapper[5047]: E0223 06:46:30.365131 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:31.365102286 +0000 UTC m=+113.616429440 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.365868 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 10:25:39.408454113 +0000 UTC Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.466404 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.466497 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.466527 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.466566 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.466593 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.570075 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.570120 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.570133 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.570154 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.570169 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.673650 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.673713 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.673732 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.673759 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.673781 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.777173 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.777244 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.777264 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.777297 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.777317 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.880782 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.881249 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.881337 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.881418 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.881512 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.984578 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.984625 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.984636 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.984654 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:30 crc kubenswrapper[5047]: I0223 06:46:30.984667 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:30Z","lastTransitionTime":"2026-02-23T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.088598 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.088671 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.088689 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.088716 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.088737 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.192522 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.192605 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.192629 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.192661 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.192685 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.296777 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.296832 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.296845 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.296865 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.296880 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.340481 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.340602 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.340483 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.340738 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:31 crc kubenswrapper[5047]: E0223 06:46:31.340788 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:31 crc kubenswrapper[5047]: E0223 06:46:31.340852 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:31 crc kubenswrapper[5047]: E0223 06:46:31.340973 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:31 crc kubenswrapper[5047]: E0223 06:46:31.341171 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.366590 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 00:16:57.711641414 +0000 UTC Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.377592 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:31 crc kubenswrapper[5047]: E0223 06:46:31.377864 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:31 crc kubenswrapper[5047]: E0223 06:46:31.378026 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:33.377993039 +0000 UTC m=+115.629320393 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.399312 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.399392 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.399412 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.399443 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.399477 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.502993 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.503060 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.503068 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.503087 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.503097 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.605844 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.605978 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.605997 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.606029 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.606048 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.709966 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.710029 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.710044 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.710071 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.710088 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.813883 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.813993 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.814016 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.814043 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.814063 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.917436 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.917538 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.917552 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.917581 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:31 crc kubenswrapper[5047]: I0223 06:46:31.917595 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:31Z","lastTransitionTime":"2026-02-23T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.021309 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.021385 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.021404 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.021433 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.021451 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.125585 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.125662 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.125671 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.125692 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.125704 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.229493 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.229572 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.229590 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.229620 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.229642 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.333850 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.333939 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.333953 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.333976 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.333990 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.367329 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:25:14.356256029 +0000 UTC Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.437782 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.437857 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.437871 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.437897 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.437936 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.540858 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.540894 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.540919 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.540933 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.540942 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.643873 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.644015 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.644049 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.644091 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.644114 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.747465 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.747528 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.747545 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.747569 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.747587 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.768387 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.768442 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.768467 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.768500 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.768525 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: E0223 06:46:32.793654 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.800058 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.800111 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.800130 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.800157 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.800175 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: E0223 06:46:32.821245 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.827838 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.827893 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.827940 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.827967 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.827992 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: E0223 06:46:32.851396 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.857623 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.857693 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.857718 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.857751 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.857776 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.885097 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.885198 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.885221 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.885253 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.885279 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:32 crc kubenswrapper[5047]: E0223 06:46:32.904679 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:32 crc kubenswrapper[5047]: E0223 06:46:32.904927 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.907385 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.907440 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.907455 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.907477 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:32 crc kubenswrapper[5047]: I0223 06:46:32.907492 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:32Z","lastTransitionTime":"2026-02-23T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.010715 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.010776 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.010791 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.010813 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.010829 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.114032 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.114080 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.114089 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.114109 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.114125 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.217221 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.217263 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.217280 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.217301 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.217315 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.321635 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.321698 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.321719 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.321750 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.321773 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.340099 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.340155 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.340152 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.340040 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:33 crc kubenswrapper[5047]: E0223 06:46:33.340349 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:33 crc kubenswrapper[5047]: E0223 06:46:33.340499 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:33 crc kubenswrapper[5047]: E0223 06:46:33.340694 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:33 crc kubenswrapper[5047]: E0223 06:46:33.340875 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.368389 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 08:36:57.48952857 +0000 UTC Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.404198 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:33 crc kubenswrapper[5047]: E0223 06:46:33.404361 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:33 crc kubenswrapper[5047]: E0223 06:46:33.404444 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:37.404424803 +0000 UTC m=+119.655751927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.424852 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.424928 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.424957 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.424981 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.424997 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.527845 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.527948 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.527971 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.528002 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.528021 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.631070 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.631129 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.631139 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.631156 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.631166 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.734291 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.734369 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.734393 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.734432 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.734454 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.837870 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.837943 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.837956 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.837977 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.837992 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.941175 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.941239 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.941253 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.941275 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:33 crc kubenswrapper[5047]: I0223 06:46:33.941291 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:33Z","lastTransitionTime":"2026-02-23T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.045640 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.045716 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.045734 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.045763 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.045787 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.149485 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.149533 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.149543 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.149563 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.149574 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.253197 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.253254 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.253268 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.253290 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.253308 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.355787 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.355899 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.356115 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.356146 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.356168 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.369003 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 11:09:08.877450616 +0000 UTC Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.460379 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.460449 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.460467 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.460493 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.460528 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.564006 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.564089 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.564107 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.564137 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.564156 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.667723 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.667804 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.667826 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.667858 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.667881 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.771385 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.771442 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.771456 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.771498 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.771514 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.874495 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.874564 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.874583 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.874608 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.874626 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.976339 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.976401 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.976421 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.976444 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:34 crc kubenswrapper[5047]: I0223 06:46:34.976464 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:34Z","lastTransitionTime":"2026-02-23T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.080411 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.080486 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.080506 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.080534 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.080556 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.183707 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.183750 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.183760 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.183775 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.183785 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.288129 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.288207 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.288233 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.288264 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.288286 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.340842 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.340964 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.341033 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.341092 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:35 crc kubenswrapper[5047]: E0223 06:46:35.341255 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:35 crc kubenswrapper[5047]: E0223 06:46:35.341415 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:35 crc kubenswrapper[5047]: E0223 06:46:35.341662 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:35 crc kubenswrapper[5047]: E0223 06:46:35.341940 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.370269 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 09:00:52.583855143 +0000 UTC Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.391598 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.391695 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.391727 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.391761 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.391786 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.496686 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.496748 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.496764 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.496784 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.496800 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.582629 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.600391 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.600459 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.600479 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.600505 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.600523 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.604039 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z 
is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.619430 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.655166 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.677540 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.699667 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.704619 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.704690 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.704709 5047 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.704739 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.704760 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.719212 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.740075 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.760096 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4
bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.795624 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.808894 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.809003 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.809022 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.809064 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.809090 5047 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.816470 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
1192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac
58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.837897 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.863104 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.881966 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.902313 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc 
kubenswrapper[5047]: I0223 06:46:35.912441 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.912509 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.912523 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.912546 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.912561 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:35Z","lastTransitionTime":"2026-02-23T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.928362 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:35 crc kubenswrapper[5047]: I0223 06:46:35.952381 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.016065 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.016175 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.016209 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.016244 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.016266 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.119259 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.119328 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.119345 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.119371 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.119394 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.223317 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.223372 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.223384 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.223403 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.223416 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.326531 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.326599 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.326614 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.326638 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.326656 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.371152 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 02:37:54.885987321 +0000 UTC Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.430384 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.430699 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.430726 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.430760 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.430782 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.535155 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.535229 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.535241 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.535262 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.535274 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.638323 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.638416 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.638437 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.638472 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.638494 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.741421 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.741466 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.741477 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.741494 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.741505 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.845167 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.846192 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.846435 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.846676 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.847166 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.951158 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.951221 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.951246 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.951278 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:36 crc kubenswrapper[5047]: I0223 06:46:36.951298 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:36Z","lastTransitionTime":"2026-02-23T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.054698 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.054770 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.054792 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.054868 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.054895 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.159589 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.160151 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.160179 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.160217 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.160242 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.263429 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.263478 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.263489 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.263507 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.263523 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.340700 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.340757 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.340713 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:37 crc kubenswrapper[5047]: E0223 06:46:37.340968 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.341050 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:37 crc kubenswrapper[5047]: E0223 06:46:37.341172 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:37 crc kubenswrapper[5047]: E0223 06:46:37.341254 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:37 crc kubenswrapper[5047]: E0223 06:46:37.341304 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.366746 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.366808 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.366826 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.366849 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.366869 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.372053 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 17:03:43.942983701 +0000 UTC Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.457169 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:37 crc kubenswrapper[5047]: E0223 06:46:37.457374 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:37 crc kubenswrapper[5047]: E0223 06:46:37.457499 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:45.457466667 +0000 UTC m=+127.708793801 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.470770 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.470813 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.470823 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.470842 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.470856 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.574993 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.575078 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.575096 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.575131 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.575163 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.678813 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.678890 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.678939 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.678966 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.678992 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.783970 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.784057 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.784075 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.784102 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.784119 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.887788 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.887883 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.887936 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.887966 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.887986 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.990102 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.990166 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.990186 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.990215 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:37 crc kubenswrapper[5047]: I0223 06:46:37.990238 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:37Z","lastTransitionTime":"2026-02-23T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.094520 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.094583 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.094602 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.094626 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.094645 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:38Z","lastTransitionTime":"2026-02-23T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.198440 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.198511 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.198528 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.198556 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.198574 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:38Z","lastTransitionTime":"2026-02-23T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:38 crc kubenswrapper[5047]: E0223 06:46:38.299318 5047 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.360385 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c
3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.372723 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 14:21:43.333278008 +0000 UTC Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.392478 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.408988 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.422631 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.439763 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: E0223 06:46:38.449754 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.454050 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.467985 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.479385 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.495131 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.512771 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc 
kubenswrapper[5047]: I0223 06:46:38.533164 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.550814 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.575642 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.592465 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.609433 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:38 crc kubenswrapper[5047]: I0223 06:46:38.626575 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3f
d14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:39 crc kubenswrapper[5047]: I0223 06:46:39.340825 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:39 crc kubenswrapper[5047]: I0223 06:46:39.340864 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:39 crc kubenswrapper[5047]: I0223 06:46:39.340977 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:39 crc kubenswrapper[5047]: I0223 06:46:39.341054 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:39 crc kubenswrapper[5047]: E0223 06:46:39.341161 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:39 crc kubenswrapper[5047]: E0223 06:46:39.341335 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:39 crc kubenswrapper[5047]: E0223 06:46:39.341586 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:39 crc kubenswrapper[5047]: E0223 06:46:39.341655 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:39 crc kubenswrapper[5047]: I0223 06:46:39.373681 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 07:03:40.993141745 +0000 UTC Feb 23 06:46:40 crc kubenswrapper[5047]: I0223 06:46:40.374752 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 05:44:42.151663634 +0000 UTC Feb 23 06:46:41 crc kubenswrapper[5047]: I0223 06:46:41.340328 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:41 crc kubenswrapper[5047]: I0223 06:46:41.340330 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:41 crc kubenswrapper[5047]: E0223 06:46:41.340556 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:41 crc kubenswrapper[5047]: I0223 06:46:41.340389 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:41 crc kubenswrapper[5047]: E0223 06:46:41.340692 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:41 crc kubenswrapper[5047]: I0223 06:46:41.340368 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:41 crc kubenswrapper[5047]: E0223 06:46:41.340791 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:41 crc kubenswrapper[5047]: E0223 06:46:41.341008 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:41 crc kubenswrapper[5047]: I0223 06:46:41.375493 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 11:13:34.137296126 +0000 UTC Feb 23 06:46:42 crc kubenswrapper[5047]: I0223 06:46:42.376531 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:15:40.52309202 +0000 UTC Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.046987 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.047062 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.047316 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.047396 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.047418 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:43Z","lastTransitionTime":"2026-02-23T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.065514 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:43Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.073033 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.073114 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.073127 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.073147 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.073182 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:43Z","lastTransitionTime":"2026-02-23T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.097726 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:43Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.104324 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.104436 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.104465 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.104518 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.104543 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:43Z","lastTransitionTime":"2026-02-23T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.128518 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:43Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.134525 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.134591 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.134610 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.134642 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.134661 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:43Z","lastTransitionTime":"2026-02-23T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.157872 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:43Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.163365 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.163425 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.163448 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.163475 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.163492 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:43Z","lastTransitionTime":"2026-02-23T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.184110 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:43Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.184381 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.340316 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.340317 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.340539 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.340718 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.340736 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.341051 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.341159 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.341282 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:43 crc kubenswrapper[5047]: I0223 06:46:43.377009 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 19:41:35.62420404 +0000 UTC Feb 23 06:46:43 crc kubenswrapper[5047]: E0223 06:46:43.451355 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:46:44 crc kubenswrapper[5047]: I0223 06:46:44.342766 5047 scope.go:117] "RemoveContainer" containerID="677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c" Feb 23 06:46:44 crc kubenswrapper[5047]: I0223 06:46:44.377848 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 14:05:17.702507736 +0000 UTC Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.021152 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/1.log" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.024657 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc"} Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.025661 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.041656 5047 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.058721 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.118585 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.143243 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.157161 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.157765 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:47:17.157744183 +0000 UTC m=+159.409071317 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.161457 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.173940 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d0
6c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.183037 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.199125 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11
f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.218143 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 
06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.235597 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.249669 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.258398 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.258451 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.258488 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.258511 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258564 5047 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258643 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:17.258619194 +0000 UTC m=+159.509946328 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258674 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258683 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258747 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258696 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258791 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258800 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258807 5047 projected.go:194] Error preparing data for 
projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258779 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:17.258757938 +0000 UTC m=+159.510085072 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258855 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:17.258848511 +0000 UTC m=+159.510175645 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.258873 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:17.258864731 +0000 UTC m=+159.510191865 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.266983 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.287802 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.298674 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.310463 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.320196 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:45Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:45 crc 
kubenswrapper[5047]: I0223 06:46:45.340827 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.340933 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.340999 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.340832 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.341110 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.340854 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.341220 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.341298 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.378627 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 02:22:05.987697875 +0000 UTC Feb 23 06:46:45 crc kubenswrapper[5047]: I0223 06:46:45.460664 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.460952 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:45 crc kubenswrapper[5047]: E0223 06:46:45.461080 5047 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:01.461016683 +0000 UTC m=+143.712343837 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.030368 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/2.log" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.031182 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/1.log" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.034231 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc" exitCode=1 Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.034277 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc"} Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.034319 5047 scope.go:117] "RemoveContainer" containerID="677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.034915 5047 scope.go:117] "RemoveContainer" containerID="3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc" Feb 23 06:46:46 crc kubenswrapper[5047]: 
E0223 06:46:46.035097 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.056593 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.072082 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.085014 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.097695 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.110176 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.125400 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.148667 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://677d1cf1245f220c8a46d8728deb3781d5c2f9ced1170edabcb8e1a02ab1362c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"message\\\":\\\"emoval\\\\nI0223 06:46:27.871478 6899 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0223 06:46:27.871499 6899 factory.go:656] Stopping watch factory\\\\nI0223 06:46:27.871528 6899 handler.go:208] Removed *v1.Node event handler 7\\\\nI0223 06:46:27.871546 6899 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 
06:46:27.871559 6899 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 06:46:27.871572 6899 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 06:46:27.871583 6899 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:46:27.871622 6899 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871820 6899 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.871953 6899 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872096 6899 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872403 6899 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:46:27.872627 6899 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\
\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.163383 5047 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.180798 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3
d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.193342 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc 
kubenswrapper[5047]: I0223 06:46:46.218977 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.232731 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.265191 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.280379 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.296796 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.313400 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3f
d14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:46 crc kubenswrapper[5047]: I0223 06:46:46.379599 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 15:32:11.306008923 +0000 UTC Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.040715 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/2.log" Feb 23 06:46:47 crc kubenswrapper[5047]: 
I0223 06:46:47.044466 5047 scope.go:117] "RemoveContainer" containerID="3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc" Feb 23 06:46:47 crc kubenswrapper[5047]: E0223 06:46:47.044856 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.057511 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.072051 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.085552 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc 
kubenswrapper[5047]: I0223 06:46:47.101636 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.115439 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.141237 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.165525 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.180625 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.199439 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3f
d14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.223827 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.255450 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.272866 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.290031 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.304787 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.318869 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.331087 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.340513 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.340520 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.340702 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:47 crc kubenswrapper[5047]: E0223 06:46:47.340819 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.340847 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:47 crc kubenswrapper[5047]: E0223 06:46:47.341058 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:47 crc kubenswrapper[5047]: E0223 06:46:47.341124 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:47 crc kubenswrapper[5047]: E0223 06:46:47.341306 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:47 crc kubenswrapper[5047]: I0223 06:46:47.380132 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 19:21:02.815411155 +0000 UTC Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.357851 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.376510 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.380539 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 11:18:38.412373273 +0000 UTC Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.392847 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc 
kubenswrapper[5047]: I0223 06:46:48.411483 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.432276 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: E0223 06:46:48.452526 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.453722 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 
06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.479301 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.506381 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.532620 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.547155 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.562424 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.574774 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.592083 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.624572 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.641283 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:48 crc kubenswrapper[5047]: I0223 06:46:48.657635 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:49 crc kubenswrapper[5047]: I0223 06:46:49.340935 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:49 crc kubenswrapper[5047]: I0223 06:46:49.340978 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:49 crc kubenswrapper[5047]: I0223 06:46:49.340984 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:49 crc kubenswrapper[5047]: I0223 06:46:49.341033 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:49 crc kubenswrapper[5047]: E0223 06:46:49.341795 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:49 crc kubenswrapper[5047]: E0223 06:46:49.341951 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:49 crc kubenswrapper[5047]: E0223 06:46:49.342093 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:49 crc kubenswrapper[5047]: E0223 06:46:49.342260 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:49 crc kubenswrapper[5047]: I0223 06:46:49.380968 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 08:31:12.076915033 +0000 UTC Feb 23 06:46:50 crc kubenswrapper[5047]: I0223 06:46:50.381969 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 20:45:22.14306346 +0000 UTC Feb 23 06:46:51 crc kubenswrapper[5047]: I0223 06:46:51.340869 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:51 crc kubenswrapper[5047]: I0223 06:46:51.340889 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:51 crc kubenswrapper[5047]: E0223 06:46:51.341107 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:51 crc kubenswrapper[5047]: I0223 06:46:51.340940 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:51 crc kubenswrapper[5047]: I0223 06:46:51.340939 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:51 crc kubenswrapper[5047]: E0223 06:46:51.341309 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:51 crc kubenswrapper[5047]: E0223 06:46:51.341497 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:51 crc kubenswrapper[5047]: E0223 06:46:51.341630 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:51 crc kubenswrapper[5047]: I0223 06:46:51.382829 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 09:28:15.855982741 +0000 UTC Feb 23 06:46:52 crc kubenswrapper[5047]: I0223 06:46:52.384530 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 19:06:37.086911638 +0000 UTC Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.288892 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.288973 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.289488 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.289785 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.289800 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:53Z","lastTransitionTime":"2026-02-23T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.303003 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.308426 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.308468 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.308487 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.308505 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.308517 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:53Z","lastTransitionTime":"2026-02-23T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.327781 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.332572 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.332620 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.332631 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.332660 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.332673 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:53Z","lastTransitionTime":"2026-02-23T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.340923 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.341030 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.341185 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.341199 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.341282 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.341485 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.341605 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.341686 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.346547 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.351259 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.351315 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.351362 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.351387 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.351405 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:53Z","lastTransitionTime":"2026-02-23T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.356669 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.370711 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c0
7f350921\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.376697 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.376790 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.376814 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.376847 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.376867 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:53Z","lastTransitionTime":"2026-02-23T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:53 crc kubenswrapper[5047]: I0223 06:46:53.385652 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 09:58:12.957608212 +0000 UTC Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.391440 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",
\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.391628 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:53 crc kubenswrapper[5047]: E0223 06:46:53.453977 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:46:54 crc kubenswrapper[5047]: I0223 06:46:54.386745 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 02:12:32.907521711 +0000 UTC Feb 23 06:46:55 crc kubenswrapper[5047]: I0223 06:46:55.339860 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:55 crc kubenswrapper[5047]: I0223 06:46:55.339936 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:55 crc kubenswrapper[5047]: I0223 06:46:55.339893 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:55 crc kubenswrapper[5047]: I0223 06:46:55.339880 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:55 crc kubenswrapper[5047]: E0223 06:46:55.340063 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:55 crc kubenswrapper[5047]: E0223 06:46:55.340454 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:55 crc kubenswrapper[5047]: E0223 06:46:55.340340 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:55 crc kubenswrapper[5047]: E0223 06:46:55.340561 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:55 crc kubenswrapper[5047]: I0223 06:46:55.386954 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:18:54.806292932 +0000 UTC Feb 23 06:46:56 crc kubenswrapper[5047]: I0223 06:46:56.387118 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:11:30.816228752 +0000 UTC Feb 23 06:46:57 crc kubenswrapper[5047]: I0223 06:46:57.339834 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:57 crc kubenswrapper[5047]: I0223 06:46:57.339961 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:57 crc kubenswrapper[5047]: I0223 06:46:57.339957 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:57 crc kubenswrapper[5047]: I0223 06:46:57.340008 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:46:57 crc kubenswrapper[5047]: E0223 06:46:57.340067 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:57 crc kubenswrapper[5047]: E0223 06:46:57.340133 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:57 crc kubenswrapper[5047]: E0223 06:46:57.340203 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:46:57 crc kubenswrapper[5047]: E0223 06:46:57.340293 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:57 crc kubenswrapper[5047]: I0223 06:46:57.388343 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 17:09:50.481037397 +0000 UTC Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.366582 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.385737 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z 
is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.388733 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 14:50:27.119373922 +0000 UTC Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.412183 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df4717
39b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.429658 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.445358 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: E0223 06:46:58.454999 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.460683 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.478130 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.495012 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.514561 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.526687 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.542358 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.562609 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.577483 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.593154 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.609263 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc 
kubenswrapper[5047]: I0223 06:46:58.624608 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:58 crc kubenswrapper[5047]: I0223 06:46:58.644817 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:46:59 crc kubenswrapper[5047]: I0223 06:46:59.339959 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:59 crc kubenswrapper[5047]: I0223 06:46:59.340084 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:59 crc kubenswrapper[5047]: E0223 06:46:59.340308 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:59 crc kubenswrapper[5047]: I0223 06:46:59.340172 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:59 crc kubenswrapper[5047]: I0223 06:46:59.340094 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp"
Feb 23 06:46:59 crc kubenswrapper[5047]: E0223 06:46:59.340629 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:59 crc kubenswrapper[5047]: E0223 06:46:59.340596 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:59 crc kubenswrapper[5047]: E0223 06:46:59.340986 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7"
Feb 23 06:46:59 crc kubenswrapper[5047]: I0223 06:46:59.389308 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 08:07:57.302023095 +0000 UTC
Feb 23 06:47:00 crc kubenswrapper[5047]: I0223 06:47:00.390392 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 05:30:07.520998916 +0000 UTC
Feb 23 06:47:01 crc kubenswrapper[5047]: I0223 06:47:01.340652 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:47:01 crc kubenswrapper[5047]: I0223 06:47:01.340707 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:47:01 crc kubenswrapper[5047]: I0223 06:47:01.340761 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp"
Feb 23 06:47:01 crc kubenswrapper[5047]: I0223 06:47:01.340761 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:47:01 crc kubenswrapper[5047]: E0223 06:47:01.340827 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:47:01 crc kubenswrapper[5047]: E0223 06:47:01.341042 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:47:01 crc kubenswrapper[5047]: E0223 06:47:01.341263 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7"
Feb 23 06:47:01 crc kubenswrapper[5047]: E0223 06:47:01.341294 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:47:01 crc kubenswrapper[5047]: I0223 06:47:01.390883 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:17:54.436808912 +0000 UTC
Feb 23 06:47:01 crc kubenswrapper[5047]: I0223 06:47:01.559235 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp"
Feb 23 06:47:01 crc kubenswrapper[5047]: E0223 06:47:01.559390 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 06:47:01 crc kubenswrapper[5047]: E0223 06:47:01.559467 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:33.559445127 +0000 UTC m=+175.810772261 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 06:47:02 crc kubenswrapper[5047]: I0223 06:47:02.341700 5047 scope.go:117] "RemoveContainer" containerID="3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc"
Feb 23 06:47:02 crc kubenswrapper[5047]: E0223 06:47:02.342036 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130"
Feb 23 06:47:02 crc kubenswrapper[5047]: I0223 06:47:02.391065 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 18:06:59.050751838 +0000 UTC
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.340446 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.340563 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.340507 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.340709 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.340603 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.340458 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.340995 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.341176 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.350925 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.391201 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 07:30:22.463814593 +0000 UTC
Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.457112 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.503438 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.503488 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.503498 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.503516 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.503528 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:03Z","lastTransitionTime":"2026-02-23T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.522551 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:03Z is after 2025-08-24T17:21:41Z"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.526801 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.526854 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.526871 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.526892 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.526929 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:03Z","lastTransitionTime":"2026-02-23T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.539508 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:03Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.543009 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.543081 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.543096 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.543114 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.543124 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:03Z","lastTransitionTime":"2026-02-23T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.554376 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:03Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.558254 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.558300 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.558317 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.558336 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.558347 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:03Z","lastTransitionTime":"2026-02-23T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.570451 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:03Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.574450 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.574513 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.574529 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.574553 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:03 crc kubenswrapper[5047]: I0223 06:47:03.574572 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:03Z","lastTransitionTime":"2026-02-23T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.595297 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:03Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:03 crc kubenswrapper[5047]: E0223 06:47:03.595427 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.116992 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/0.log" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.117088 5047 generic.go:334] "Generic (PLEG): container finished" podID="e0fbd5e6-7dcc-4a13-936e-0db2e66394e8" containerID="e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea" exitCode=1 Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.117235 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerDied","Data":"e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea"} Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.118430 5047 scope.go:117] "RemoveContainer" containerID="e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.145879 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.173262 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.193672 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.210662 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.225842 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.251956 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.273617 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.289688 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.304018 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.314188 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc 
kubenswrapper[5047]: I0223 06:47:04.323123 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p
dn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.335418 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 
2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.347107 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.356721 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.373722 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.387620 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.391935 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 08:22:50.087695086 +0000 UTC Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.400550 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:04 crc kubenswrapper[5047]: I0223 06:47:04.411919 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:04Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.124562 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/0.log" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.124669 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerStarted","Data":"86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd"} Feb 23 06:47:05 crc 
kubenswrapper[5047]: I0223 06:47:05.141410 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p
dn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.159023 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.180135 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc 
kubenswrapper[5047]: I0223 06:47:05.195329 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.215163 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.234013 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.260414 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.277786 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.300194 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.318789 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.340532 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.340610 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:05 crc kubenswrapper[5047]: E0223 06:47:05.340690 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.340746 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.340618 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:05 crc kubenswrapper[5047]: E0223 06:47:05.340843 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:05 crc kubenswrapper[5047]: E0223 06:47:05.341024 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:05 crc kubenswrapper[5047]: E0223 06:47:05.341236 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.352362 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.373810 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.393140 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:27:16.708969525 +0000 UTC Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.398978 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.422876 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.444062 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.464628 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.479695 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:05 crc kubenswrapper[5047]: I0223 06:47:05.504208 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:06 crc kubenswrapper[5047]: I0223 06:47:06.393641 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 15:33:55.374129527 +0000 UTC Feb 23 06:47:07 crc kubenswrapper[5047]: I0223 06:47:07.340442 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:07 crc kubenswrapper[5047]: I0223 06:47:07.340513 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:07 crc kubenswrapper[5047]: I0223 06:47:07.340474 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:07 crc kubenswrapper[5047]: I0223 06:47:07.340442 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:07 crc kubenswrapper[5047]: E0223 06:47:07.340648 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:07 crc kubenswrapper[5047]: E0223 06:47:07.340801 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:07 crc kubenswrapper[5047]: E0223 06:47:07.340991 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:07 crc kubenswrapper[5047]: E0223 06:47:07.341084 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:07 crc kubenswrapper[5047]: I0223 06:47:07.393829 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:28:16.313612262 +0000 UTC Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.356453 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.378398 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.394276 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 09:31:24.691053318 +0000 UTC Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.397529 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.416131 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.439386 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.456516 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: E0223 06:47:08.458451 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.480683 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5
f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.516688 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.531883 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.553131 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.572952 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc 
kubenswrapper[5047]: I0223 06:47:08.591261 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.613967 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.635809 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.654612 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.665890 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.676737 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:08 crc kubenswrapper[5047]: I0223 06:47:08.686710 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:09 crc kubenswrapper[5047]: I0223 06:47:09.340296 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:09 crc kubenswrapper[5047]: E0223 06:47:09.340471 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:09 crc kubenswrapper[5047]: I0223 06:47:09.340882 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:09 crc kubenswrapper[5047]: E0223 06:47:09.340965 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:09 crc kubenswrapper[5047]: I0223 06:47:09.341080 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:09 crc kubenswrapper[5047]: E0223 06:47:09.341130 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:09 crc kubenswrapper[5047]: I0223 06:47:09.341068 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:09 crc kubenswrapper[5047]: E0223 06:47:09.341324 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:09 crc kubenswrapper[5047]: I0223 06:47:09.394932 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 02:40:53.500454619 +0000 UTC Feb 23 06:47:10 crc kubenswrapper[5047]: I0223 06:47:10.395928 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 04:34:53.541248285 +0000 UTC Feb 23 06:47:11 crc kubenswrapper[5047]: I0223 06:47:11.340146 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:11 crc kubenswrapper[5047]: I0223 06:47:11.340250 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:11 crc kubenswrapper[5047]: I0223 06:47:11.340305 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:11 crc kubenswrapper[5047]: I0223 06:47:11.340399 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:11 crc kubenswrapper[5047]: E0223 06:47:11.340426 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:11 crc kubenswrapper[5047]: E0223 06:47:11.340547 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:11 crc kubenswrapper[5047]: E0223 06:47:11.340658 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:11 crc kubenswrapper[5047]: E0223 06:47:11.340734 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:11 crc kubenswrapper[5047]: I0223 06:47:11.397070 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 14:16:35.186944545 +0000 UTC Feb 23 06:47:12 crc kubenswrapper[5047]: I0223 06:47:12.397819 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 02:58:46.183121254 +0000 UTC Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.340156 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.340191 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.340274 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.340326 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.340163 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.340413 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.340530 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.340668 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.398765 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 19:35:44.770708159 +0000 UTC Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.459895 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.684533 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.684573 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.684581 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.684595 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.684605 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:13Z","lastTransitionTime":"2026-02-23T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.697638 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.702268 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.702370 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.702396 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.702427 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.702451 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:13Z","lastTransitionTime":"2026-02-23T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.751065 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.759312 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.759366 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.759380 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.759400 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.759412 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:13Z","lastTransitionTime":"2026-02-23T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.777567 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.781870 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.781951 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.781969 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.781990 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.782000 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:13Z","lastTransitionTime":"2026-02-23T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.796180 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.800018 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.800080 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.800091 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.800114 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:13 crc kubenswrapper[5047]: I0223 06:47:13.800124 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:13Z","lastTransitionTime":"2026-02-23T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.810983 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:13 crc kubenswrapper[5047]: E0223 06:47:13.811095 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:47:14 crc kubenswrapper[5047]: I0223 06:47:14.399602 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:38:39.661633715 +0000 UTC Feb 23 06:47:15 crc kubenswrapper[5047]: I0223 06:47:15.340368 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:15 crc kubenswrapper[5047]: I0223 06:47:15.340402 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:15 crc kubenswrapper[5047]: I0223 06:47:15.340491 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:15 crc kubenswrapper[5047]: I0223 06:47:15.340367 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:15 crc kubenswrapper[5047]: E0223 06:47:15.340618 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:15 crc kubenswrapper[5047]: E0223 06:47:15.340769 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:15 crc kubenswrapper[5047]: E0223 06:47:15.340839 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:15 crc kubenswrapper[5047]: E0223 06:47:15.341052 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:15 crc kubenswrapper[5047]: I0223 06:47:15.400198 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:18:19.523694916 +0000 UTC Feb 23 06:47:16 crc kubenswrapper[5047]: I0223 06:47:16.342626 5047 scope.go:117] "RemoveContainer" containerID="3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc" Feb 23 06:47:16 crc kubenswrapper[5047]: I0223 06:47:16.401366 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:44:46.643331422 +0000 UTC Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.177634 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/2.log" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.181332 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3"} Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.181793 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.194515 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.207001 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.223277 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.241010 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.241772 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.241933 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.241897663 +0000 UTC m=+223.493224797 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.254617 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.272490 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.315131 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.327417 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.339002 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.340161 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.340192 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.340297 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.340308 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.340327 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.340501 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.340555 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.340815 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.343263 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.343322 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.343364 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.343405 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343405 5047 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343505 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343521 5047 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343532 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343557 5047 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343572 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343597 5047 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343610 5047 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343532 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.343504635 +0000 UTC m=+223.594831809 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343680 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.343651099 +0000 UTC m=+223.594978243 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343700 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.34369117 +0000 UTC m=+223.595018314 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:47:17 crc kubenswrapper[5047]: E0223 06:47:17.343719 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.343710211 +0000 UTC m=+223.595037365 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.356225 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.357407 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.368553 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc 
kubenswrapper[5047]: I0223 06:47:17.382715 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.397669 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.402814 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 10:28:13.802398687 +0000 UTC Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.410864 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.421613 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.445459 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.458486 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:17 crc kubenswrapper[5047]: I0223 06:47:17.470678 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.186407 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/3.log" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.187125 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/2.log" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.189680 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3" exitCode=1 Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.189776 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3"} Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.189820 5047 scope.go:117] "RemoveContainer" 
containerID="3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.190404 5047 scope.go:117] "RemoveContainer" containerID="a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3" Feb 23 06:47:18 crc kubenswrapper[5047]: E0223 06:47:18.190661 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.206969 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\",\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.225343 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.237832 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.249057 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.264037 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.278716 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.296425 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:17Z\\\",\\\"message\\\":\\\"s:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0223 06:47:17.268425 7479 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:47:17.268515 7479 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z]\\\\nI0223 06:47:17.268382 7479 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mo
untPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.309318 5047 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.321528 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.332128 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc 
kubenswrapper[5047]: I0223 06:47:18.340984 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p
dn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.353418 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 
2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.364467 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.373582 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.391071 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.402469 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.403394 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:22:32.137614741 +0000 UTC Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.414541 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.425305 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.435595 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf015224-9482-4d3e-ae9f-89925330c564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467ed8a21c0c42fdc43da9adf454627fd84c723f7336563bfbcb6c06a048009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0223 06:45:10.983788 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0223 06:45:10.985605 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:45:10.987317 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:45:10.988649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:45:39.397168 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0223 06:45:41.385031 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:10Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2394c5130c5db1be3f5acb17466223dbea987efe7cbe5a2fb40033d9db92bc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea2a8886b868cf981148d459b9d6f92665b1d346bd8d2a07478b2715a8db06f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.445200 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.456948 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: E0223 06:47:18.460894 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.468043 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc 
kubenswrapper[5047]: I0223 06:47:18.478633 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.491701 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.502541 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.514535 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.525001 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.536676 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf015224-9482-4d3e-ae9f-89925330c564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467ed8a21c0c42fdc43da9adf454627fd84c723f7336563bfbcb6c06a048009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0223 06:45:10.983788 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0223 06:45:10.985605 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:45:10.987317 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:45:10.988649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:45:39.397168 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0223 06:45:41.385031 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:10Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2394c5130c5db1be3f5acb17466223dbea987efe7cbe5a2fb40033d9db92bc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea2a8886b868cf981148d459b9d6f92665b1d346bd8d2a07478b2715a8db06f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.555453 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.566868 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.579457 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.590758 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.600881 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.616017 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877
570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.633916 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e330986ccb8dd60e5fa1918691d8f0f423937c30372a98e9efb416e4a307cdc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:46:45Z\\\",\\\"message\\\":\\\".396596 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0223 06:46:45.396601 7154 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9gf5k in node crc\\\\nI0223 06:46:45.396434 7154 obj_retry.go:303] Retry object 
setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0223 06:46:45.396608 7154 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9gf5k after 0 failed attempt(s)\\\\nI0223 06:46:45.396611 7154 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-54jbp] creating logical port openshift-multus_network-metrics-daemon-54jbp for pod on switch crc\\\\nI0223 06:46:45.396602 7154 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0223 06:46:45.396568 7154 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:46:45.396682 7154 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:17Z\\\",\\\"message\\\":\\\"s:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0223 06:47:17.268425 7479 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:47:17.268515 7479 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z]\\\\nI0223 06:47:17.268382 7479 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mo
untPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.647108 5047 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.662006 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:18 crc kubenswrapper[5047]: I0223 06:47:18.674189 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.197564 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/3.log" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.203042 5047 scope.go:117] "RemoveContainer" containerID="a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3" Feb 23 06:47:19 crc kubenswrapper[5047]: E0223 06:47:19.203244 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.218869 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.242788 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.263976 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.282824 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf015224-9482-4d3e-ae9f-89925330c564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467ed8a21c0c42fdc43da9adf454627fd84c723f7336563bfbcb6c06a048009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0223 06:45:10.983788 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0223 06:45:10.985605 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:45:10.987317 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:45:10.988649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:45:39.397168 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0223 06:45:41.385031 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:10Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2394c5130c5db1be3f5acb17466223dbea987efe7cbe5a2fb40033d9db92bc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea2a8886b868cf981148d459b9d6f92665b1d346bd8d2a07478b2715a8db06f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.312436 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.327548 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.340683 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.340806 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:19 crc kubenswrapper[5047]: E0223 06:47:19.340888 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:19 crc kubenswrapper[5047]: E0223 06:47:19.341012 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.341112 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.341150 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:19 crc kubenswrapper[5047]: E0223 06:47:19.341212 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:19 crc kubenswrapper[5047]: E0223 06:47:19.341285 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.342101 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.355749 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.382010 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.402723 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:17Z\\\",\\\"message\\\":\\\"s:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0223 06:47:17.268425 7479 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:47:17.268515 7479 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z]\\\\nI0223 06:47:17.268382 7479 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:47:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.403600 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 19:46:05.620164029 +0000 UTC Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.414649 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.431438 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.444546 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.458944 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.471757 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.483156 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.495248 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d
16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.509561 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:19 crc kubenswrapper[5047]: I0223 06:47:19.521561 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:20 crc 
kubenswrapper[5047]: I0223 06:47:20.404564 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 15:26:08.508485146 +0000 UTC Feb 23 06:47:21 crc kubenswrapper[5047]: I0223 06:47:21.340857 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:21 crc kubenswrapper[5047]: I0223 06:47:21.340960 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:21 crc kubenswrapper[5047]: E0223 06:47:21.341124 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:21 crc kubenswrapper[5047]: I0223 06:47:21.341180 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:21 crc kubenswrapper[5047]: E0223 06:47:21.341228 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:21 crc kubenswrapper[5047]: I0223 06:47:21.341222 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:21 crc kubenswrapper[5047]: E0223 06:47:21.341412 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:21 crc kubenswrapper[5047]: E0223 06:47:21.341597 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:21 crc kubenswrapper[5047]: I0223 06:47:21.405778 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 11:41:36.560936175 +0000 UTC Feb 23 06:47:22 crc kubenswrapper[5047]: I0223 06:47:22.406718 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 01:46:47.707279537 +0000 UTC Feb 23 06:47:23 crc kubenswrapper[5047]: I0223 06:47:23.340652 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:23 crc kubenswrapper[5047]: I0223 06:47:23.340731 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:23 crc kubenswrapper[5047]: I0223 06:47:23.340768 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:23 crc kubenswrapper[5047]: I0223 06:47:23.340845 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:23 crc kubenswrapper[5047]: E0223 06:47:23.340839 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:23 crc kubenswrapper[5047]: E0223 06:47:23.341057 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:23 crc kubenswrapper[5047]: E0223 06:47:23.341140 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:23 crc kubenswrapper[5047]: E0223 06:47:23.341241 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:23 crc kubenswrapper[5047]: I0223 06:47:23.407258 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 13:53:48.907494021 +0000 UTC Feb 23 06:47:23 crc kubenswrapper[5047]: E0223 06:47:23.462300 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.140096 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.140157 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.140172 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.140194 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.140209 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:24Z","lastTransitionTime":"2026-02-23T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:24 crc kubenswrapper[5047]: E0223 06:47:24.161012 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.166800 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.166895 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.166967 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.167041 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.167067 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:24Z","lastTransitionTime":"2026-02-23T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:24 crc kubenswrapper[5047]: E0223 06:47:24.184795 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.189451 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.189551 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.189603 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.189627 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.189644 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:24Z","lastTransitionTime":"2026-02-23T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:24 crc kubenswrapper[5047]: E0223 06:47:24.206412 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.213043 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.213141 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.213180 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.213235 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.213255 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:24Z","lastTransitionTime":"2026-02-23T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:24 crc kubenswrapper[5047]: E0223 06:47:24.231493 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.235656 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.235707 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.235719 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.235738 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.235750 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:24Z","lastTransitionTime":"2026-02-23T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:47:24 crc kubenswrapper[5047]: E0223 06:47:24.250182 5047 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b1b59f5d-eb2f-45ea-8116-a187d6509bf4\\\",\\\"systemUUID\\\":\\\"d12f1023-e3c1-471d-b7b7-19c07f350921\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:24Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:24 crc kubenswrapper[5047]: E0223 06:47:24.250340 5047 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:47:24 crc kubenswrapper[5047]: I0223 06:47:24.408258 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 21:10:54.385120353 +0000 UTC Feb 23 06:47:25 crc kubenswrapper[5047]: I0223 06:47:25.339926 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:25 crc kubenswrapper[5047]: I0223 06:47:25.340064 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:25 crc kubenswrapper[5047]: E0223 06:47:25.340104 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:25 crc kubenswrapper[5047]: I0223 06:47:25.339947 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:25 crc kubenswrapper[5047]: I0223 06:47:25.339945 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:25 crc kubenswrapper[5047]: E0223 06:47:25.340267 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:25 crc kubenswrapper[5047]: E0223 06:47:25.340490 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:25 crc kubenswrapper[5047]: E0223 06:47:25.340559 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:25 crc kubenswrapper[5047]: I0223 06:47:25.408558 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 14:45:16.949269344 +0000 UTC Feb 23 06:47:26 crc kubenswrapper[5047]: I0223 06:47:26.409134 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 07:25:23.138942435 +0000 UTC Feb 23 06:47:27 crc kubenswrapper[5047]: I0223 06:47:27.340710 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:27 crc kubenswrapper[5047]: I0223 06:47:27.340780 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:27 crc kubenswrapper[5047]: I0223 06:47:27.340884 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:27 crc kubenswrapper[5047]: E0223 06:47:27.341036 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:27 crc kubenswrapper[5047]: I0223 06:47:27.341377 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:27 crc kubenswrapper[5047]: E0223 06:47:27.341445 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:27 crc kubenswrapper[5047]: E0223 06:47:27.341666 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:27 crc kubenswrapper[5047]: E0223 06:47:27.341746 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:27 crc kubenswrapper[5047]: I0223 06:47:27.410515 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 06:02:50.255507473 +0000 UTC Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.361170 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec66e286-df44-470e-920a-7cd1b5f0dc4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://808841a24cbb860aa46294fd300f6b1fbd869ea49128ecdff9d8020c9201bde3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d42897d51439f03b46f67d68b7ff2425a23f157d0d16572ce4097d28d672ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.380521 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709e655d8de2f221b7eec7673edb6efdb2f3865f3cb2144f07f8a72a85ad3884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f0c5744c273be72b16ab5f8e102ee507089cef3dbafca7153672f6d6caa4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.395443 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.411441 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:14:51.404324353 +0000 UTC Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.411662 5047 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf015224-9482-4d3e-ae9f-89925330c564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467ed8a21c0c42fdc43da9adf454627fd84c723f7336563bfbcb6c06a048009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f18d208b90ca9fc5264ec1c3860df730a39cca8c3249c17cec85518f4d2c0bae\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0223 06:45:10.983788 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0223 06:45:10.985605 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:45:10.987317 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:45:10.988649 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:45:39.397168 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0223 06:45:41.385031 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:10Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2394c5130c5db1be3f5acb17466223dbea987efe7cbe5a2fb40033d9db92bc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea2a8886b868cf981148d459b9d6f92665b1d346bd8d2a07478b2715a8db06f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.436952 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e227a064-d314-4f6d-a1d0-5003f15074c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e413accac4c70e5ce78db751e2bd726818177ee4b9b17854c56e60e496b8aa19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c79589c35927536925d456a4e49fe55cfdffbcf72092c38cb3bfb52461f0c2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c951c3ecab2ce95c34ff0a98d9d155d3efb250fa9abb3f01d0dfe086d003ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9e75181f9414c14f4df471739b47f7a9dc40c15c1af03f2db3319c5f30926b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9af42ee878c33af46a11d99fbed4ab3da6ce23b72cb255de54f0b3416d5c9dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b26253a42f10f63a489883e28d51cccc71f7cfe48ed7e8f01c9f3fe16e188e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed295bae89a93e9c625f883f6b2d68b8a15bfd46d50e1909dc0cebf6706aef16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ff1c37c2de1fd6fdae606877a544076f5b88d07cc00f685758f52305cfbe054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.454557 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c644bd6dd10278a6d37269a1b43043162474850f0db7fb12693a343543d3496b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: E0223 06:47:28.464038 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.475464 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n5dz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:03Z\\\",\\\"message\\\":\\\"2026-02-23T06:46:17+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137\\\\n2026-02-23T06:46:17+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_c73682ee-1245-4ebf-8d69-0276668d8137 to /host/opt/cni/bin/\\\\n2026-02-23T06:46:18Z [verbose] multus-daemon started\\\\n2026-02-23T06:46:18Z [verbose] Readiness Indicator file check\\\\n2026-02-23T06:47:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsvvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n5dz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.492977 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca275411-978b-439b-ab4b-f98a7ac42f8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d19685244bda7b4b64def42191df6f12d4c184bcfb0685a5b30f19090a262d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f1
9c24059ef5a19025acb606ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9z88\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wh6hv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.512476 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df53e315-5672-4e94-96bd-fd4f705103c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc00615f039620c11f5a8751d941fca009cb877570c903debb9da76f555483c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c8a433e44a02cfb3f9833fb8621693c0a2825a93587cd9200ee1cb111ede82a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686cfbdd5f95b31b5e98b43daec06da29614fcf3fdf1c696c24bd2e9978226c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a405d3074efdec83dbcab5047f575adc23f5b5a781a7b5ed13de82a6b8ea50d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac4b
7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ac4b7d3488e2b11bad3d41c28f5fcc2ef3aa76966581e13ae17d19537b54ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10e792956a3b3947513062e421e29ca4bbc433be5dbd685f00b8fdaa1acc4928\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa3dde4ff333d457f4d3cf7df670e4ca25b03fc27457b229f494657a5214eea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tl4z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9gf5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.543315 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d56904fe-1a5a-4fde-b122-947fd9a28130\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:47:17Z\\\",\\\"message\\\":\\\"s:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0223 06:47:17.268425 7479 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:47:17.268515 7479 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:17Z is after 2025-08-24T17:21:41Z]\\\\nI0223 06:47:17.268382 7479 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:47:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b25772e94627fc6401
0897e81c05ad3a51691044ea91e42a61a02318d5c6b15f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7qs44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rklm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.560109 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ac5be00-0def-4e19-9494-852debd9f858\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07882d40dca6a83764ed89d4374217a2f3f47d2b2cf1ea24d5b78d1b5d637f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30f84f8932b31484defc9e203eab4381b61d6fd0315c51b5f388477d46d0a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8820373a5aa097eb52489502d77cfb56d8c2c4b7edb0d19705e9ab00a2e131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5a9dcd6668a413f3c15dfa9dc822f6f224ad7864ad8776169b66afea9f2f1dc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.578309 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24137acc-2a91-4059-b282-cee970a1a349\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:33Z\\\"
,\\\"message\\\":\\\"W0223 06:45:32.641706 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:32.642258 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829132 cert, and key in /tmp/serving-cert-1405730035/serving-signer.crt, /tmp/serving-cert-1405730035/serving-signer.key\\\\nI0223 06:45:33.281076 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:33.284807 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:33.284967 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:33.285676 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1405730035/tls.crt::/tmp/serving-cert-1405730035/tls.key\\\\\\\"\\\\nF0223 06:45:33.556959 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:33Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:44:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:44:39Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:44:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.596214 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.612037 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.631361 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f69d89cc7afb08768876f86037a2fb6fdd6bf7870ff29cca76b11d63d8b6ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.645747 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d5mx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e6146e4-f9c9-4d32-9de3-e26a05eb6c6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69ac993205b71c2fe0a57832d32b982480f139e44373a860fd798ebdaf20911f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s7nvs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d5mx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.660758 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bhfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ca4c2cb-5b99-4efc-a494-2e1bc2894054\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ec7eed3dac698c3203fed4f9b0e6f3fea5f1c77387765ee9c0d8d
16582c4820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pdn6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bhfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.676719 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a8a99af-4edb-4cf0-8c50-3ae9a6e38181\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fa87ef494abcaa050f83ef57731e15003200091602154d6dbb1d47e13346352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2829b45d767c56f74094474eeb2d84a69b386
d79e33a87e376c477579c5c5175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtvdl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vz92d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:28 crc kubenswrapper[5047]: I0223 06:47:28.689357 5047 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-54jbp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb811549-5811-4996-ba8c-6f8848a80ce7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvfcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:46:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-54jbp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:47:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:47:29 crc 
kubenswrapper[5047]: I0223 06:47:29.340740 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:29 crc kubenswrapper[5047]: I0223 06:47:29.341104 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:29 crc kubenswrapper[5047]: I0223 06:47:29.341194 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:29 crc kubenswrapper[5047]: I0223 06:47:29.341225 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:29 crc kubenswrapper[5047]: E0223 06:47:29.341338 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:29 crc kubenswrapper[5047]: E0223 06:47:29.341569 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:29 crc kubenswrapper[5047]: E0223 06:47:29.341663 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:29 crc kubenswrapper[5047]: E0223 06:47:29.341728 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:29 crc kubenswrapper[5047]: I0223 06:47:29.412739 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:19:42.078740581 +0000 UTC Feb 23 06:47:30 crc kubenswrapper[5047]: I0223 06:47:30.413628 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 16:05:20.986906071 +0000 UTC Feb 23 06:47:31 crc kubenswrapper[5047]: I0223 06:47:31.351060 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:31 crc kubenswrapper[5047]: I0223 06:47:31.351231 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:31 crc kubenswrapper[5047]: I0223 06:47:31.351349 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:31 crc kubenswrapper[5047]: E0223 06:47:31.351348 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:31 crc kubenswrapper[5047]: I0223 06:47:31.351457 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:31 crc kubenswrapper[5047]: E0223 06:47:31.351510 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:31 crc kubenswrapper[5047]: E0223 06:47:31.351638 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:31 crc kubenswrapper[5047]: E0223 06:47:31.351688 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:31 crc kubenswrapper[5047]: I0223 06:47:31.414009 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 00:12:04.834998302 +0000 UTC Feb 23 06:47:32 crc kubenswrapper[5047]: I0223 06:47:32.414391 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 04:51:30.627495396 +0000 UTC Feb 23 06:47:33 crc kubenswrapper[5047]: I0223 06:47:33.340698 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:33 crc kubenswrapper[5047]: I0223 06:47:33.341048 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:33 crc kubenswrapper[5047]: I0223 06:47:33.341064 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:33 crc kubenswrapper[5047]: I0223 06:47:33.341089 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.341267 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.341364 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.341451 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.341763 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:33 crc kubenswrapper[5047]: I0223 06:47:33.342061 5047 scope.go:117] "RemoveContainer" containerID="a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3" Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.342222 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:47:33 crc kubenswrapper[5047]: I0223 06:47:33.415563 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 07:23:42.315586552 +0000 UTC Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.465088 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:47:33 crc kubenswrapper[5047]: I0223 06:47:33.563179 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.563416 5047 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:47:33 crc kubenswrapper[5047]: E0223 06:47:33.563521 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs podName:cb811549-5811-4996-ba8c-6f8848a80ce7 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:37.56348924 +0000 UTC m=+239.814816374 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs") pod "network-metrics-daemon-54jbp" (UID: "cb811549-5811-4996-ba8c-6f8848a80ce7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.415936 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:59:55.3878938 +0000 UTC Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.604278 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.604336 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.604350 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.604370 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.604385 5047 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:47:34Z","lastTransitionTime":"2026-02-23T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.648781 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9"] Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.649321 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.651049 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.651797 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.652398 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.653700 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.694182 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=17.694147552 podStartE2EDuration="17.694147552s" podCreationTimestamp="2026-02-23 06:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.671115864 +0000 UTC m=+176.922443008" watchObservedRunningTime="2026-02-23 06:47:34.694147552 +0000 UTC m=+176.945474696" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.694486 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.694479732 podStartE2EDuration="1m14.694479732s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.693998438 +0000 UTC m=+176.945325592" watchObservedRunningTime="2026-02-23 06:47:34.694479732 +0000 UTC m=+176.945806866" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.738267 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n5dz9" podStartSLOduration=105.738242464 podStartE2EDuration="1m45.738242464s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.726548374 +0000 UTC m=+176.977875508" watchObservedRunningTime="2026-02-23 06:47:34.738242464 +0000 UTC m=+176.989569598" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.738435 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podStartSLOduration=105.738431369 podStartE2EDuration="1m45.738431369s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-23 06:47:34.738052718 +0000 UTC m=+176.989379852" watchObservedRunningTime="2026-02-23 06:47:34.738431369 +0000 UTC m=+176.989758503" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.762194 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9gf5k" podStartSLOduration=105.762169109 podStartE2EDuration="1m45.762169109s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.761799547 +0000 UTC m=+177.013126691" watchObservedRunningTime="2026-02-23 06:47:34.762169109 +0000 UTC m=+177.013496243" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.762608 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d5mx4" podStartSLOduration=105.762600731 podStartE2EDuration="1m45.762600731s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.747576324 +0000 UTC m=+176.998903458" watchObservedRunningTime="2026-02-23 06:47:34.762600731 +0000 UTC m=+177.013927875" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.776195 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd981e2c-7a59-47d2-a1fb-775fb634a365-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.776260 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/fd981e2c-7a59-47d2-a1fb-775fb634a365-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.776290 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fd981e2c-7a59-47d2-a1fb-775fb634a365-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.776305 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd981e2c-7a59-47d2-a1fb-775fb634a365-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.776328 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fd981e2c-7a59-47d2-a1fb-775fb634a365-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.803110 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.803087697 podStartE2EDuration="41.803087697s" podCreationTimestamp="2026-02-23 06:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.802812429 +0000 UTC m=+177.054139563" watchObservedRunningTime="2026-02-23 06:47:34.803087697 +0000 UTC m=+177.054414831" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.822316 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=76.822294865 podStartE2EDuration="1m16.822294865s" podCreationTimestamp="2026-02-23 06:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.822154231 +0000 UTC m=+177.073481355" watchObservedRunningTime="2026-02-23 06:47:34.822294865 +0000 UTC m=+177.073622019" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.876988 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fd981e2c-7a59-47d2-a1fb-775fb634a365-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.877173 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd981e2c-7a59-47d2-a1fb-775fb634a365-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.877201 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fd981e2c-7a59-47d2-a1fb-775fb634a365-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.877232 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fd981e2c-7a59-47d2-a1fb-775fb634a365-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.877255 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd981e2c-7a59-47d2-a1fb-775fb634a365-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.877289 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fd981e2c-7a59-47d2-a1fb-775fb634a365-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.877369 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fd981e2c-7a59-47d2-a1fb-775fb634a365-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.877987 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fd981e2c-7a59-47d2-a1fb-775fb634a365-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.894650 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd981e2c-7a59-47d2-a1fb-775fb634a365-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.898654 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd981e2c-7a59-47d2-a1fb-775fb634a365-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fhmx9\" (UID: \"fd981e2c-7a59-47d2-a1fb-775fb634a365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.906056 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bhfwp" podStartSLOduration=105.906031818 podStartE2EDuration="1m45.906031818s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.905886853 +0000 UTC m=+177.157213987" watchObservedRunningTime="2026-02-23 06:47:34.906031818 +0000 UTC m=+177.157358952" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.918942 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vz92d" podStartSLOduration=105.918919682 podStartE2EDuration="1m45.918919682s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.918145469 +0000 UTC m=+177.169472603" watchObservedRunningTime="2026-02-23 06:47:34.918919682 +0000 UTC m=+177.170246836" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.944981 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.944954028 podStartE2EDuration="31.944954028s" podCreationTimestamp="2026-02-23 06:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:34.944556927 +0000 UTC m=+177.195884061" watchObservedRunningTime="2026-02-23 06:47:34.944954028 +0000 UTC m=+177.196281182" Feb 23 06:47:34 crc kubenswrapper[5047]: I0223 06:47:34.965933 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.262370 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" event={"ID":"fd981e2c-7a59-47d2-a1fb-775fb634a365","Type":"ContainerStarted","Data":"b35a5c4f6eaa8eb7e9184ef5fc1573971f0dcd0b1253df9f1a8c4d371a99db4c"} Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.262438 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" event={"ID":"fd981e2c-7a59-47d2-a1fb-775fb634a365","Type":"ContainerStarted","Data":"8c92615fbb44c82b7dc2d5dd0cd764927e6d1fcba324d7d2e47dcfb820d441bc"} Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.276044 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fhmx9" 
podStartSLOduration=106.276021745 podStartE2EDuration="1m46.276021745s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:35.27585922 +0000 UTC m=+177.527186364" watchObservedRunningTime="2026-02-23 06:47:35.276021745 +0000 UTC m=+177.527348879" Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.340660 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.340713 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.340798 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.340816 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:35 crc kubenswrapper[5047]: E0223 06:47:35.340958 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:35 crc kubenswrapper[5047]: E0223 06:47:35.341117 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:35 crc kubenswrapper[5047]: E0223 06:47:35.341262 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:35 crc kubenswrapper[5047]: E0223 06:47:35.341378 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.416345 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 18:44:14.070975612 +0000 UTC Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.416479 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 23 06:47:35 crc kubenswrapper[5047]: I0223 06:47:35.425361 5047 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 23 06:47:37 crc kubenswrapper[5047]: I0223 06:47:37.340647 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:37 crc kubenswrapper[5047]: I0223 06:47:37.340686 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:37 crc kubenswrapper[5047]: I0223 06:47:37.340800 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:37 crc kubenswrapper[5047]: I0223 06:47:37.341753 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:37 crc kubenswrapper[5047]: E0223 06:47:37.341950 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:37 crc kubenswrapper[5047]: E0223 06:47:37.342118 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:37 crc kubenswrapper[5047]: E0223 06:47:37.342244 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:37 crc kubenswrapper[5047]: E0223 06:47:37.342400 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:38 crc kubenswrapper[5047]: E0223 06:47:38.466502 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:47:39 crc kubenswrapper[5047]: I0223 06:47:39.340190 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:39 crc kubenswrapper[5047]: I0223 06:47:39.340332 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:39 crc kubenswrapper[5047]: E0223 06:47:39.340369 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:39 crc kubenswrapper[5047]: I0223 06:47:39.340438 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:39 crc kubenswrapper[5047]: E0223 06:47:39.340676 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:39 crc kubenswrapper[5047]: E0223 06:47:39.340803 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:39 crc kubenswrapper[5047]: I0223 06:47:39.341505 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:39 crc kubenswrapper[5047]: E0223 06:47:39.341813 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:41 crc kubenswrapper[5047]: I0223 06:47:41.340543 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:41 crc kubenswrapper[5047]: I0223 06:47:41.340543 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:41 crc kubenswrapper[5047]: E0223 06:47:41.340722 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:41 crc kubenswrapper[5047]: I0223 06:47:41.340570 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:41 crc kubenswrapper[5047]: I0223 06:47:41.340555 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:41 crc kubenswrapper[5047]: E0223 06:47:41.340806 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:41 crc kubenswrapper[5047]: E0223 06:47:41.340785 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:41 crc kubenswrapper[5047]: E0223 06:47:41.340945 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:43 crc kubenswrapper[5047]: I0223 06:47:43.340786 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:43 crc kubenswrapper[5047]: I0223 06:47:43.340791 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:43 crc kubenswrapper[5047]: I0223 06:47:43.340960 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:43 crc kubenswrapper[5047]: E0223 06:47:43.341102 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:43 crc kubenswrapper[5047]: I0223 06:47:43.341210 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:43 crc kubenswrapper[5047]: E0223 06:47:43.341601 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:43 crc kubenswrapper[5047]: E0223 06:47:43.341634 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:43 crc kubenswrapper[5047]: E0223 06:47:43.341755 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:43 crc kubenswrapper[5047]: E0223 06:47:43.468159 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:47:45 crc kubenswrapper[5047]: I0223 06:47:45.340426 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:45 crc kubenswrapper[5047]: I0223 06:47:45.340610 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:45 crc kubenswrapper[5047]: E0223 06:47:45.340636 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:45 crc kubenswrapper[5047]: I0223 06:47:45.341122 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:45 crc kubenswrapper[5047]: I0223 06:47:45.341180 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:45 crc kubenswrapper[5047]: E0223 06:47:45.341378 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:45 crc kubenswrapper[5047]: E0223 06:47:45.341442 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:45 crc kubenswrapper[5047]: I0223 06:47:45.342259 5047 scope.go:117] "RemoveContainer" containerID="a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3" Feb 23 06:47:45 crc kubenswrapper[5047]: E0223 06:47:45.341634 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:45 crc kubenswrapper[5047]: E0223 06:47:45.342816 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rklm9_openshift-ovn-kubernetes(d56904fe-1a5a-4fde-b122-947fd9a28130)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" Feb 23 06:47:47 crc kubenswrapper[5047]: I0223 06:47:47.340787 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:47 crc kubenswrapper[5047]: E0223 06:47:47.340977 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:47 crc kubenswrapper[5047]: I0223 06:47:47.340787 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:47 crc kubenswrapper[5047]: I0223 06:47:47.341065 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:47 crc kubenswrapper[5047]: E0223 06:47:47.341316 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:47 crc kubenswrapper[5047]: E0223 06:47:47.341466 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:47 crc kubenswrapper[5047]: I0223 06:47:47.341688 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:47 crc kubenswrapper[5047]: E0223 06:47:47.341834 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:48 crc kubenswrapper[5047]: E0223 06:47:48.469137 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 23 06:47:49 crc kubenswrapper[5047]: I0223 06:47:49.339861 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:49 crc kubenswrapper[5047]: I0223 06:47:49.339993 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:49 crc kubenswrapper[5047]: I0223 06:47:49.339894 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:49 crc kubenswrapper[5047]: E0223 06:47:49.340069 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:49 crc kubenswrapper[5047]: E0223 06:47:49.340241 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:49 crc kubenswrapper[5047]: E0223 06:47:49.340299 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:49 crc kubenswrapper[5047]: I0223 06:47:49.340724 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:49 crc kubenswrapper[5047]: E0223 06:47:49.341002 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:50 crc kubenswrapper[5047]: I0223 06:47:50.318411 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/1.log" Feb 23 06:47:50 crc kubenswrapper[5047]: I0223 06:47:50.318932 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/0.log" Feb 23 06:47:50 crc kubenswrapper[5047]: I0223 06:47:50.318959 5047 generic.go:334] "Generic (PLEG): container finished" podID="e0fbd5e6-7dcc-4a13-936e-0db2e66394e8" containerID="86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd" exitCode=1 Feb 23 06:47:50 crc kubenswrapper[5047]: I0223 06:47:50.318988 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerDied","Data":"86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd"} Feb 23 06:47:50 crc kubenswrapper[5047]: I0223 06:47:50.319024 5047 scope.go:117] "RemoveContainer" containerID="e6bce8d7956c0832779bac55d053958271d7db40a53bbd3a657c6cf6a0d5e2ea" Feb 23 06:47:50 crc kubenswrapper[5047]: 
I0223 06:47:50.319380 5047 scope.go:117] "RemoveContainer" containerID="86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd" Feb 23 06:47:50 crc kubenswrapper[5047]: E0223 06:47:50.319510 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n5dz9_openshift-multus(e0fbd5e6-7dcc-4a13-936e-0db2e66394e8)\"" pod="openshift-multus/multus-n5dz9" podUID="e0fbd5e6-7dcc-4a13-936e-0db2e66394e8" Feb 23 06:47:51 crc kubenswrapper[5047]: I0223 06:47:51.324455 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/1.log" Feb 23 06:47:51 crc kubenswrapper[5047]: I0223 06:47:51.340680 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:51 crc kubenswrapper[5047]: I0223 06:47:51.340780 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:51 crc kubenswrapper[5047]: E0223 06:47:51.340850 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:51 crc kubenswrapper[5047]: E0223 06:47:51.341000 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:51 crc kubenswrapper[5047]: I0223 06:47:51.341071 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:51 crc kubenswrapper[5047]: I0223 06:47:51.341129 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:51 crc kubenswrapper[5047]: E0223 06:47:51.341210 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:51 crc kubenswrapper[5047]: E0223 06:47:51.341290 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:53 crc kubenswrapper[5047]: I0223 06:47:53.340561 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:53 crc kubenswrapper[5047]: I0223 06:47:53.340649 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:53 crc kubenswrapper[5047]: E0223 06:47:53.341237 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:53 crc kubenswrapper[5047]: I0223 06:47:53.340681 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:53 crc kubenswrapper[5047]: E0223 06:47:53.341276 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:53 crc kubenswrapper[5047]: I0223 06:47:53.340701 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:53 crc kubenswrapper[5047]: E0223 06:47:53.341472 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:53 crc kubenswrapper[5047]: E0223 06:47:53.341632 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:53 crc kubenswrapper[5047]: E0223 06:47:53.470553 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:47:55 crc kubenswrapper[5047]: I0223 06:47:55.340525 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:55 crc kubenswrapper[5047]: I0223 06:47:55.340583 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:55 crc kubenswrapper[5047]: I0223 06:47:55.340552 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:55 crc kubenswrapper[5047]: I0223 06:47:55.340525 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:55 crc kubenswrapper[5047]: E0223 06:47:55.340671 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:55 crc kubenswrapper[5047]: E0223 06:47:55.340738 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:55 crc kubenswrapper[5047]: E0223 06:47:55.340813 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:55 crc kubenswrapper[5047]: E0223 06:47:55.341060 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:57 crc kubenswrapper[5047]: I0223 06:47:57.340353 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:57 crc kubenswrapper[5047]: I0223 06:47:57.340418 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:57 crc kubenswrapper[5047]: I0223 06:47:57.340469 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:57 crc kubenswrapper[5047]: I0223 06:47:57.340353 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:57 crc kubenswrapper[5047]: E0223 06:47:57.340535 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:57 crc kubenswrapper[5047]: E0223 06:47:57.340692 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:47:57 crc kubenswrapper[5047]: E0223 06:47:57.340840 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:57 crc kubenswrapper[5047]: E0223 06:47:57.340952 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:58 crc kubenswrapper[5047]: E0223 06:47:58.471428 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:47:59 crc kubenswrapper[5047]: I0223 06:47:59.340879 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:59 crc kubenswrapper[5047]: I0223 06:47:59.341012 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:47:59 crc kubenswrapper[5047]: E0223 06:47:59.341087 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:47:59 crc kubenswrapper[5047]: I0223 06:47:59.340877 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:59 crc kubenswrapper[5047]: E0223 06:47:59.341194 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:47:59 crc kubenswrapper[5047]: I0223 06:47:59.340898 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:59 crc kubenswrapper[5047]: E0223 06:47:59.341288 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:47:59 crc kubenswrapper[5047]: E0223 06:47:59.341344 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:48:00 crc kubenswrapper[5047]: I0223 06:48:00.341611 5047 scope.go:117] "RemoveContainer" containerID="a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.340375 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.340411 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:48:01 crc kubenswrapper[5047]: E0223 06:48:01.340995 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.340439 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:48:01 crc kubenswrapper[5047]: E0223 06:48:01.341065 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.341188 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:48:01 crc kubenswrapper[5047]: E0223 06:48:01.341328 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:48:01 crc kubenswrapper[5047]: E0223 06:48:01.341383 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.361968 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/3.log" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.365066 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerStarted","Data":"990884e87515f311025d6cd5914b187883806cd5a9b1797ec1cebb761603db6c"} Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.365664 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.402733 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podStartSLOduration=132.402709508 podStartE2EDuration="2m12.402709508s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:01.401356001 +0000 UTC m=+203.652683135" watchObservedRunningTime="2026-02-23 06:48:01.402709508 +0000 UTC m=+203.654036642" Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.474020 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-54jbp"] Feb 23 06:48:01 crc kubenswrapper[5047]: I0223 06:48:01.474149 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:48:01 crc kubenswrapper[5047]: E0223 06:48:01.474243 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:48:03 crc kubenswrapper[5047]: I0223 06:48:03.340098 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:48:03 crc kubenswrapper[5047]: I0223 06:48:03.340226 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:48:03 crc kubenswrapper[5047]: E0223 06:48:03.340614 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:48:03 crc kubenswrapper[5047]: I0223 06:48:03.340302 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:48:03 crc kubenswrapper[5047]: I0223 06:48:03.340274 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:48:03 crc kubenswrapper[5047]: E0223 06:48:03.340767 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:48:03 crc kubenswrapper[5047]: E0223 06:48:03.340847 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:48:03 crc kubenswrapper[5047]: E0223 06:48:03.341003 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:48:03 crc kubenswrapper[5047]: E0223 06:48:03.472729 5047 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:48:05 crc kubenswrapper[5047]: I0223 06:48:05.339947 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:48:05 crc kubenswrapper[5047]: I0223 06:48:05.340103 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:48:05 crc kubenswrapper[5047]: E0223 06:48:05.340249 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:48:05 crc kubenswrapper[5047]: I0223 06:48:05.340302 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:48:05 crc kubenswrapper[5047]: E0223 06:48:05.340463 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:48:05 crc kubenswrapper[5047]: I0223 06:48:05.340526 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:48:05 crc kubenswrapper[5047]: E0223 06:48:05.340787 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:48:05 crc kubenswrapper[5047]: E0223 06:48:05.340891 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:48:05 crc kubenswrapper[5047]: I0223 06:48:05.340989 5047 scope.go:117] "RemoveContainer" containerID="86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd" Feb 23 06:48:06 crc kubenswrapper[5047]: I0223 06:48:06.383579 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/1.log" Feb 23 06:48:06 crc kubenswrapper[5047]: I0223 06:48:06.383665 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerStarted","Data":"87ca2a1c87094d3f86cba1506eb761f303a7247c006a0f88e9dab663f29c5209"} Feb 23 06:48:07 crc kubenswrapper[5047]: I0223 06:48:07.340600 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:48:07 crc kubenswrapper[5047]: I0223 06:48:07.340718 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:48:07 crc kubenswrapper[5047]: E0223 06:48:07.341122 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-54jbp" podUID="cb811549-5811-4996-ba8c-6f8848a80ce7" Feb 23 06:48:07 crc kubenswrapper[5047]: I0223 06:48:07.340783 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:48:07 crc kubenswrapper[5047]: E0223 06:48:07.341174 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:48:07 crc kubenswrapper[5047]: I0223 06:48:07.340723 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:48:07 crc kubenswrapper[5047]: E0223 06:48:07.341200 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:48:07 crc kubenswrapper[5047]: E0223 06:48:07.341274 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.340711 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.340760 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.340917 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.340935 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.343290 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.343367 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.345547 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.345554 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.345968 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 23 06:48:09 crc kubenswrapper[5047]: I0223 06:48:09.348135 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.269494 5047 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.317158 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.317810 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.320396 5047 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.320481 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.320638 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cm66d"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.321258 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.321296 5047 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.321506 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.321309 5047 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.321566 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.322677 5047 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4"] Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.323228 5047 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.323303 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.323412 5047 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.323465 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.323458 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.324015 5047 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.324054 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.324379 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44"] Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.325112 5047 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace 
"openshift-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.325170 5047 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.325174 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.325220 5047 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.325217 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.325244 5047 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.325305 5047 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.325337 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.325359 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.325456 5047 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.325489 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: W0223 06:48:15.326799 5047 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 23 06:48:15 crc kubenswrapper[5047]: E0223 06:48:15.326863 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328242 
5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpzpr\" (UniqueName: \"kubernetes.io/projected/227fceb4-b866-4827-8919-05664750fa69-kube-api-access-qpzpr\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328342 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328434 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26trc\" (UniqueName: \"kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328497 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328552 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/227fceb4-b866-4827-8919-05664750fa69-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328613 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rgs6\" (UniqueName: \"kubernetes.io/projected/3b1235b4-4a1e-4507-88c6-387adfae3d5d-kube-api-access-5rgs6\") pod \"cluster-samples-operator-665b6dd947-vff44\" (UID: \"3b1235b4-4a1e-4507-88c6-387adfae3d5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328674 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328808 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328866 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328958 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsltq\" (UniqueName: \"kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.329014 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/227fceb4-b866-4827-8919-05664750fa69-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.328884 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.329971 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.330290 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.329981 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.329060 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3b1235b4-4a1e-4507-88c6-387adfae3d5d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vff44\" (UID: \"3b1235b4-4a1e-4507-88c6-387adfae3d5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.330730 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.330796 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-serving-cert\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.330815 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.338809 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.330885 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.331153 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 
06:48:15.331283 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.331345 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.343537 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9l6zf"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.344891 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.356044 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-csx28"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.356805 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dhcsk"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.357243 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.357886 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.357939 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.358187 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.358355 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.358621 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.359992 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qgn89"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.360693 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.360015 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.365033 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.365260 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.365208 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.365549 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.365690 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.365974 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.366015 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.366057 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.366364 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.385994 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cfg96"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.386783 5047 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.387454 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-75wvb"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.387602 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.386804 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.388017 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.388690 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.389239 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.389697 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-75wvb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.389855 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.389317 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ttb4m"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.391355 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.395115 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.398088 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.398243 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.398548 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.398635 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.398642 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.398883 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.399216 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.399280 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.399293 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.399366 5047 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.399601 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.399648 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.400307 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.406815 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.413545 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.422037 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.416324 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.416323 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.416382 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.416405 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 06:48:15 crc 
kubenswrapper[5047]: I0223 06:48:15.416547 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.416604 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.422993 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.417658 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.420330 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.420374 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.420531 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.420660 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.421450 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.423379 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.423386 5047 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.421547 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.421494 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.421642 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.421711 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.421790 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.423738 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.427965 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.428732 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.429711 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w6ztx"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.431622 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.431953 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-v5zcl"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.432489 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.432606 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.434894 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-serving-cert\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.434978 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpzpr\" (UniqueName: \"kubernetes.io/projected/227fceb4-b866-4827-8919-05664750fa69-kube-api-access-qpzpr\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435008 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435041 
5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26trc\" (UniqueName: \"kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435064 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435087 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/227fceb4-b866-4827-8919-05664750fa69-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435137 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rgs6\" (UniqueName: \"kubernetes.io/projected/3b1235b4-4a1e-4507-88c6-387adfae3d5d-kube-api-access-5rgs6\") pod \"cluster-samples-operator-665b6dd947-vff44\" (UID: \"3b1235b4-4a1e-4507-88c6-387adfae3d5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435159 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cm66d\" (UID: 
\"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435203 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435220 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435223 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435245 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsltq\" (UniqueName: \"kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435264 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/227fceb4-b866-4827-8919-05664750fa69-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" 
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435282 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b1235b4-4a1e-4507-88c6-387adfae3d5d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vff44\" (UID: \"3b1235b4-4a1e-4507-88c6-387adfae3d5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.435302 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.437849 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.438103 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.438213 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.438416 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.438693 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.438725 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.439414 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.439596 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.439755 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.440765 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.440779 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.441038 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.441183 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.442757 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/227fceb4-b866-4827-8919-05664750fa69-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.466060 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b1235b4-4a1e-4507-88c6-387adfae3d5d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vff44\" (UID: \"3b1235b4-4a1e-4507-88c6-387adfae3d5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.467189 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/227fceb4-b866-4827-8919-05664750fa69-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.468265 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8fh6b"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.468878 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-serving-cert\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 
06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.469163 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.469345 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.469622 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.469926 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.470064 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.469969 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.469992 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.470764 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-g5zp5"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.471655 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.472270 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.472984 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.473137 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.473371 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.473797 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.474005 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.474001 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.475438 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.475566 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.477125 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.477514 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.479407 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.479420 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fdfm4"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.480492 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.483250 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.501171 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.505429 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.507129 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b9dcg"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.507960 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.508131 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.508626 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.509065 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rgs6\" (UniqueName: \"kubernetes.io/projected/3b1235b4-4a1e-4507-88c6-387adfae3d5d-kube-api-access-5rgs6\") pod \"cluster-samples-operator-665b6dd947-vff44\" (UID: \"3b1235b4-4a1e-4507-88c6-387adfae3d5d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.509086 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.510716 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.511506 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.511612 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.513106 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.513468 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.514304 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.515259 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.516042 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.522646 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.523580 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.524257 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.525369 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.525642 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.528919 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpzpr\" (UniqueName: \"kubernetes.io/projected/227fceb4-b866-4827-8919-05664750fa69-kube-api-access-qpzpr\") pod \"openshift-apiserver-operator-796bbdcf4f-wc7s4\" (UID: \"227fceb4-b866-4827-8919-05664750fa69\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.533448 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.534016 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.534587 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-27bdk"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.534941 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.535269 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rhztb"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.535461 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.535795 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.535962 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536174 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536277 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rhztb"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536842 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536855 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536886 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cm66d"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536923 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536937 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qgn89"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.536885 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-config\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537002 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537023 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-image-import-ca\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537055 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzkqx\" (UniqueName: \"kubernetes.io/projected/b448908d-5bb0-437e-a06c-d608dd395160-kube-api-access-dzkqx\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537104 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-oauth-serving-cert\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537131 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65bc34e5-539d-4d00-8562-f8b02a455c4d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537164 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-audit\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537194 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-etcd-ca\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537226 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd61b952-18b4-490f-97cb-dd2938aa9d22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537263 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537291 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9lr\" (UniqueName: \"kubernetes.io/projected/c20af121-a913-4178-aa60-35ca34fe91b2-kube-api-access-fz9lr\") pod \"dns-operator-744455d44c-w6ztx\" (UID: \"c20af121-a913-4178-aa60-35ca34fe91b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537314 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-policies\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537338 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkmvl\" (UniqueName: \"kubernetes.io/projected/cca77003-2334-4205-97ae-87f0ae6d34cc-kube-api-access-rkmvl\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537366 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-service-ca\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537390 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-etcd-client\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537410 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-trusted-ca\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537450 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-config\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537472 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wds\" (UniqueName: \"kubernetes.io/projected/cd20eb8d-5a96-407f-a898-1dad49ba8355-kube-api-access-j7wds\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537493 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-etcd-service-ca\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537529 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7hm2\" (UniqueName: \"kubernetes.io/projected/8225b605-2014-4247-8d40-ce334502bb44-kube-api-access-r7hm2\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537553 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-oauth-config\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537579 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-serving-cert\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537605 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsf94\" (UniqueName: \"kubernetes.io/projected/b4a2bba0-ebb9-4923-a829-69112ef9c89c-kube-api-access-gsf94\") pod \"downloads-7954f5f757-75wvb\" (UID: \"b4a2bba0-ebb9-4923-a829-69112ef9c89c\") " pod="openshift-console/downloads-7954f5f757-75wvb"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537666 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-config\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537686 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bc34e5-539d-4d00-8562-f8b02a455c4d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537705 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cca77003-2334-4205-97ae-87f0ae6d34cc-config\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537725 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b448908d-5bb0-437e-a06c-d608dd395160-etcd-client\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537741 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537760 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv876\" (UniqueName: \"kubernetes.io/projected/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-kube-api-access-pv876\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537779 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2mln\" (UniqueName: \"kubernetes.io/projected/cd61b952-18b4-490f-97cb-dd2938aa9d22-kube-api-access-r2mln\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537816 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65bc34e5-539d-4d00-8562-f8b02a455c4d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537847 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b448908d-5bb0-437e-a06c-d608dd395160-serving-cert\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537874 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cca77003-2334-4205-97ae-87f0ae6d34cc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.537948 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhkb\" (UniqueName: \"kubernetes.io/projected/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-kube-api-access-bmhkb\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538004 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-config\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538029 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd61b952-18b4-490f-97cb-dd2938aa9d22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538054 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8ee000-f48c-4d00-8602-cc8684f9f946-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538071 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538123 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538157 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac96f0e5-3605-48c8-80b4-ef4e02123af8-audit-dir\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538180 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cca77003-2334-4205-97ae-87f0ae6d34cc-images\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538206 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c20af121-a913-4178-aa60-35ca34fe91b2-metrics-tls\") pod \"dns-operator-744455d44c-w6ztx\" (UID: \"c20af121-a913-4178-aa60-35ca34fe91b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538247 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538271 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538296 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538319 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-dir\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538346 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538367 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-serving-cert\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538391 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk5gl\" (UniqueName: \"kubernetes.io/projected/ac96f0e5-3605-48c8-80b4-ef4e02123af8-kube-api-access-jk5gl\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538416 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-machine-approver-tls\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538440 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538463 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538572 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5sn\" (UniqueName: \"kubernetes.io/projected/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-kube-api-access-sd5sn\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538601 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-service-ca-bundle\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538765 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-trusted-ca-bundle\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538785 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-config\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538817 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538838 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-encryption-config\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538873 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-config\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538894 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538957 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8225b605-2014-4247-8d40-ce334502bb44-serving-cert\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.538982 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd61b952-18b4-490f-97cb-dd2938aa9d22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539043 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539138 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-serving-cert\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539172 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-serving-cert\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539220 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-etcd-serving-ca\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539249 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqm7\" (UniqueName: \"kubernetes.io/projected/ea8ee000-f48c-4d00-8602-cc8684f9f946-kube-api-access-xkqm7\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539401 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac96f0e5-3605-48c8-80b4-ef4e02123af8-node-pullsecrets\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539435 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbvw2\" (UniqueName: \"kubernetes.io/projected/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-kube-api-access-rbvw2\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539494 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-auth-proxy-config\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.539525 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8ee000-f48c-4d00-8602-cc8684f9f946-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.540919 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.541282 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.543763 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9l6zf"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.548177 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-csx28"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.548642 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-75wvb"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.556927 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ttb4m"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.560056 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.561316 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.564102 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-f8298"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.564835 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f8298"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.565157 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cfg96"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.567283 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.568950 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v5zcl"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.570104 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.571871 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dhcsk"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.573652 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.577275 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fdfm4"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.579171 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.582714 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.584985 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8fh6b"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.585921 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.589953 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.592946 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.593024 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w6ztx"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.595226 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn"]
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.599809 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.600045 5047 kubelet.go:2428]
"SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.600794 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f8298"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.602931 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tfqhr"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.604249 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m247m"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.604423 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.606047 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-27bdk"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.606114 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.606610 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.607672 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.608772 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b9dcg"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.609860 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.610996 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.612088 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.613201 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rhztb"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.614405 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.615753 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.617125 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.618294 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m247m"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.619543 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p7tmt"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.620430 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.620706 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.621108 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p7tmt"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640598 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-audit\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640639 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-etcd-ca\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640662 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd61b952-18b4-490f-97cb-dd2938aa9d22-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640684 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640709 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9lr\" (UniqueName: \"kubernetes.io/projected/c20af121-a913-4178-aa60-35ca34fe91b2-kube-api-access-fz9lr\") pod \"dns-operator-744455d44c-w6ztx\" (UID: \"c20af121-a913-4178-aa60-35ca34fe91b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640726 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-policies\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640742 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkmvl\" (UniqueName: \"kubernetes.io/projected/cca77003-2334-4205-97ae-87f0ae6d34cc-kube-api-access-rkmvl\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640762 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-key\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640797 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-service-ca\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640815 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-etcd-client\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640836 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-trusted-ca\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640861 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-config\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640895 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-r7hm2\" (UniqueName: \"kubernetes.io/projected/8225b605-2014-4247-8d40-ce334502bb44-kube-api-access-r7hm2\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.640982 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-oauth-config\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641019 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wds\" (UniqueName: \"kubernetes.io/projected/cd20eb8d-5a96-407f-a898-1dad49ba8355-kube-api-access-j7wds\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641047 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-etcd-service-ca\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641068 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-serving-cert\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641097 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsf94\" (UniqueName: \"kubernetes.io/projected/b4a2bba0-ebb9-4923-a829-69112ef9c89c-kube-api-access-gsf94\") pod \"downloads-7954f5f757-75wvb\" (UID: \"b4a2bba0-ebb9-4923-a829-69112ef9c89c\") " pod="openshift-console/downloads-7954f5f757-75wvb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641184 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-config\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641542 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bc34e5-539d-4d00-8562-f8b02a455c4d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641588 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cca77003-2334-4205-97ae-87f0ae6d34cc-config\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641638 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/99864770-491a-4f8e-8f3f-688436dc18ba-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-22tgp\" (UID: 
\"99864770-491a-4f8e-8f3f-688436dc18ba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641648 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-audit\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641672 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b448908d-5bb0-437e-a06c-d608dd395160-etcd-client\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641699 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641724 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv876\" (UniqueName: \"kubernetes.io/projected/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-kube-api-access-pv876\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641823 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/016925fa-5297-45db-8aa9-3f0310eb573f-trusted-ca\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641867 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2mln\" (UniqueName: \"kubernetes.io/projected/cd61b952-18b4-490f-97cb-dd2938aa9d22-kube-api-access-r2mln\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641896 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65bc34e5-539d-4d00-8562-f8b02a455c4d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.641987 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b448908d-5bb0-437e-a06c-d608dd395160-serving-cert\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642018 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cca77003-2334-4205-97ae-87f0ae6d34cc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc 
kubenswrapper[5047]: I0223 06:48:15.642102 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc1838c-dcb5-4c77-9d34-91507d631e3d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642154 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhkb\" (UniqueName: \"kubernetes.io/projected/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-kube-api-access-bmhkb\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642160 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642191 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-config\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642194 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-policies\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642217 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd61b952-18b4-490f-97cb-dd2938aa9d22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642243 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642266 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac96f0e5-3605-48c8-80b4-ef4e02123af8-audit-dir\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642291 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8ee000-f48c-4d00-8602-cc8684f9f946-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642316 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642339 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cca77003-2334-4205-97ae-87f0ae6d34cc-images\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642376 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642402 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642428 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642454 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c20af121-a913-4178-aa60-35ca34fe91b2-metrics-tls\") pod \"dns-operator-744455d44c-w6ztx\" (UID: \"c20af121-a913-4178-aa60-35ca34fe91b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642484 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/016925fa-5297-45db-8aa9-3f0310eb573f-metrics-tls\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642523 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642548 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-serving-cert\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642573 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk5gl\" (UniqueName: \"kubernetes.io/projected/ac96f0e5-3605-48c8-80b4-ef4e02123af8-kube-api-access-jk5gl\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642600 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-machine-approver-tls\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642647 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-dir\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642699 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5sn\" (UniqueName: \"kubernetes.io/projected/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-kube-api-access-sd5sn\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642726 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642750 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: 
\"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642775 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-service-ca-bundle\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642798 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-trusted-ca-bundle\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642828 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-config\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642838 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642856 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/016925fa-5297-45db-8aa9-3f0310eb573f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642884 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642932 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-encryption-config\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642962 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-config\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642985 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 
06:48:15.643032 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8225b605-2014-4247-8d40-ce334502bb44-serving-cert\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643055 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd61b952-18b4-490f-97cb-dd2938aa9d22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643076 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643154 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-serving-cert\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643248 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-serving-cert\") pod \"console-operator-58897d9998-dhcsk\" (UID: 
\"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643294 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjcc\" (UniqueName: \"kubernetes.io/projected/99864770-491a-4f8e-8f3f-688436dc18ba-kube-api-access-pfjcc\") pod \"control-plane-machine-set-operator-78cbb6b69f-22tgp\" (UID: \"99864770-491a-4f8e-8f3f-688436dc18ba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643333 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-etcd-serving-ca\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643372 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac96f0e5-3605-48c8-80b4-ef4e02123af8-node-pullsecrets\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643396 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbvw2\" (UniqueName: \"kubernetes.io/projected/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-kube-api-access-rbvw2\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643418 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ea8ee000-f48c-4d00-8602-cc8684f9f946-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643423 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqm7\" (UniqueName: \"kubernetes.io/projected/ea8ee000-f48c-4d00-8602-cc8684f9f946-kube-api-access-xkqm7\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643488 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-auth-proxy-config\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643515 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-cabundle\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643523 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-config\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 
23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643539 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4pkd\" (UniqueName: \"kubernetes.io/projected/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-kube-api-access-l4pkd\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643574 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac96f0e5-3605-48c8-80b4-ef4e02123af8-audit-dir\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643623 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8ee000-f48c-4d00-8602-cc8684f9f946-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643669 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643710 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc1838c-dcb5-4c77-9d34-91507d631e3d-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643818 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-config\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643961 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-image-import-ca\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643996 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxw6v\" (UniqueName: \"kubernetes.io/projected/016925fa-5297-45db-8aa9-3f0310eb573f-kube-api-access-nxw6v\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.644067 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-trusted-ca\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.644081 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-oauth-serving-cert\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.644270 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-auth-proxy-config\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.643722 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-config\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.642483 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-etcd-ca\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.644152 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.644246 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65bc34e5-539d-4d00-8562-f8b02a455c4d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.644816 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-etcd-service-ca\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.645295 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-oauth-config\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.645325 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b448908d-5bb0-437e-a06c-d608dd395160-config\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.645674 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-serving-cert\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.645711 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/cca77003-2334-4205-97ae-87f0ae6d34cc-images\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.646041 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.646167 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-config\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.646328 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cca77003-2334-4205-97ae-87f0ae6d34cc-config\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.646740 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.646870 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-service-ca\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.647053 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-config\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.647126 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-oauth-serving-cert\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.647161 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-serving-cert\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.647280 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.647381 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8ee000-f48c-4d00-8602-cc8684f9f946-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.647593 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.647785 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-etcd-serving-ca\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.648141 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd61b952-18b4-490f-97cb-dd2938aa9d22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.648161 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cca77003-2334-4205-97ae-87f0ae6d34cc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: 
\"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.648333 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-dir\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.648374 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac96f0e5-3605-48c8-80b4-ef4e02123af8-node-pullsecrets\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.648767 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzkqx\" (UniqueName: \"kubernetes.io/projected/b448908d-5bb0-437e-a06c-d608dd395160-kube-api-access-dzkqx\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.648807 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7qj2\" (UniqueName: \"kubernetes.io/projected/4cc1838c-dcb5-4c77-9d34-91507d631e3d-kube-api-access-s7qj2\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.649316 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-config\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.649545 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd61b952-18b4-490f-97cb-dd2938aa9d22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.649762 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac96f0e5-3605-48c8-80b4-ef4e02123af8-image-import-ca\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.649840 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.649981 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-serving-cert\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.650039 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.650314 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-trusted-ca-bundle\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.650774 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c20af121-a913-4178-aa60-35ca34fe91b2-metrics-tls\") pod \"dns-operator-744455d44c-w6ztx\" (UID: \"c20af121-a913-4178-aa60-35ca34fe91b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.650824 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.650990 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.651364 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8225b605-2014-4247-8d40-ce334502bb44-serving-cert\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.651460 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8225b605-2014-4247-8d40-ce334502bb44-service-ca-bundle\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.651472 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-etcd-client\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.652470 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac96f0e5-3605-48c8-80b4-ef4e02123af8-encryption-config\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.653049 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.653232 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b448908d-5bb0-437e-a06c-d608dd395160-serving-cert\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.653314 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.653646 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-machine-approver-tls\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.653786 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b448908d-5bb0-437e-a06c-d608dd395160-etcd-client\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.653803 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.655317 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.656392 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.656493 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.661430 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.674733 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/65bc34e5-539d-4d00-8562-f8b02a455c4d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.680753 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.689058 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65bc34e5-539d-4d00-8562-f8b02a455c4d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.722057 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.740937 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749470 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/016925fa-5297-45db-8aa9-3f0310eb573f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749523 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjcc\" (UniqueName: \"kubernetes.io/projected/99864770-491a-4f8e-8f3f-688436dc18ba-kube-api-access-pfjcc\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-22tgp\" (UID: \"99864770-491a-4f8e-8f3f-688436dc18ba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749570 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-cabundle\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749587 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4pkd\" (UniqueName: \"kubernetes.io/projected/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-kube-api-access-l4pkd\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749605 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc1838c-dcb5-4c77-9d34-91507d631e3d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749633 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxw6v\" (UniqueName: \"kubernetes.io/projected/016925fa-5297-45db-8aa9-3f0310eb573f-kube-api-access-nxw6v\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749666 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7qj2\" (UniqueName: \"kubernetes.io/projected/4cc1838c-dcb5-4c77-9d34-91507d631e3d-kube-api-access-s7qj2\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749697 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-key\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749760 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/99864770-491a-4f8e-8f3f-688436dc18ba-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-22tgp\" (UID: \"99864770-491a-4f8e-8f3f-688436dc18ba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749784 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/016925fa-5297-45db-8aa9-3f0310eb573f-trusted-ca\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749802 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc1838c-dcb5-4c77-9d34-91507d631e3d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" 
(UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.749847 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/016925fa-5297-45db-8aa9-3f0310eb573f-metrics-tls\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.760891 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.782590 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.814304 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.815212 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.827465 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.841176 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.861445 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.863419 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44"] Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.881590 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.904227 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.941730 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.962256 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.983523 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 23 06:48:15 crc kubenswrapper[5047]: I0223 06:48:15.997193 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4"] Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 
06:48:16.009338 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.011579 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/016925fa-5297-45db-8aa9-3f0310eb573f-trusted-ca\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.020798 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.035639 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/016925fa-5297-45db-8aa9-3f0310eb573f-metrics-tls\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.041305 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.061443 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.081136 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.101851 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.120913 5047 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.146057 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.160855 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.188275 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.200107 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.220123 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.224395 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cc1838c-dcb5-4c77-9d34-91507d631e3d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.241270 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.260808 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.272006 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cc1838c-dcb5-4c77-9d34-91507d631e3d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.280841 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.301194 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.321500 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.341110 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.361889 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.381202 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.401974 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.426687 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 
06:48:16.436316 5047 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.436417 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca podName:d4967a59-3c6c-4ec3-9c70-4378ec3702c6 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:16.936394232 +0000 UTC m=+219.187721366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca") pod "route-controller-manager-6576b87f9c-ct67m" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.440953 5047 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.440976 5047 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.441022 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert podName:d4967a59-3c6c-4ec3-9c70-4378ec3702c6 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:16.941004969 +0000 UTC m=+219.192332113 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert") pod "route-controller-manager-6576b87f9c-ct67m" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6") : failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.441137 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config podName:71636caa-95b7-4d64-a6f1-1cdf7dc03c07 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:16.941111293 +0000 UTC m=+219.192438427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config") pod "controller-manager-879f6c89f-cm66d" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.441636 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.441852 5047 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.441897 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles podName:71636caa-95b7-4d64-a6f1-1cdf7dc03c07 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:16.941885504 +0000 UTC m=+219.193212648 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles") pod "controller-manager-879f6c89f-cm66d" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.441943 5047 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.441973 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config podName:d4967a59-3c6c-4ec3-9c70-4378ec3702c6 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:16.941964077 +0000 UTC m=+219.193291221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config") pod "route-controller-manager-6576b87f9c-ct67m" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.441997 5047 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.442023 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca podName:71636caa-95b7-4d64-a6f1-1cdf7dc03c07 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:16.942015648 +0000 UTC m=+219.193342802 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca") pod "controller-manager-879f6c89f-cm66d" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.460338 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.476517 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" event={"ID":"227fceb4-b866-4827-8919-05664750fa69","Type":"ContainerStarted","Data":"03751f780600c87c78c149504c2af502340b819c137ea6b4f4661ff136371209"} Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.476577 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" event={"ID":"227fceb4-b866-4827-8919-05664750fa69","Type":"ContainerStarted","Data":"3af673172c4c96e40bd5a695a5cce62c92e3387f5b20e0403817d4ed9125549a"} Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.478509 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" event={"ID":"3b1235b4-4a1e-4507-88c6-387adfae3d5d","Type":"ContainerStarted","Data":"9cfa1b24918bc3eb23095f97f12a83137cca7e5ae064e6067eac91726e19dfca"} Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.478531 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" event={"ID":"3b1235b4-4a1e-4507-88c6-387adfae3d5d","Type":"ContainerStarted","Data":"548b264b936077c0bfafd50668d5b3ee4113c8dbfba4249015cb8711879da5e0"} Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.480479 5047 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.500868 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.518664 5047 request.go:700] Waited for 1.004105069s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-controller-dockercfg-c2lfx&limit=500&resourceVersion=0 Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.520930 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.540988 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.542083 5047 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.561833 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.579955 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.601995 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.621278 5047 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.642334 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.661990 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.681151 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.700979 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.720057 5047 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.722212 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.740867 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.751130 5047 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.751286 5047 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Feb 23 
06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.751303 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-key podName:3fee8332-3258-4cf5-a2c6-f1d0c1c684f0 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:17.251278855 +0000 UTC m=+219.502605989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-key") pod "service-ca-9c57cc56f-rhztb" (UID: "3fee8332-3258-4cf5-a2c6-f1d0c1c684f0") : failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.751445 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-cabundle podName:3fee8332-3258-4cf5-a2c6-f1d0c1c684f0 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:17.251413769 +0000 UTC m=+219.502740993 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-cabundle") pod "service-ca-9c57cc56f-rhztb" (UID: "3fee8332-3258-4cf5-a2c6-f1d0c1c684f0") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.751206 5047 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: E0223 06:48:16.751640 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99864770-491a-4f8e-8f3f-688436dc18ba-control-plane-machine-set-operator-tls podName:99864770-491a-4f8e-8f3f-688436dc18ba nodeName:}" failed. No retries permitted until 2026-02-23 06:48:17.251589854 +0000 UTC m=+219.502917058 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/99864770-491a-4f8e-8f3f-688436dc18ba-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-22tgp" (UID: "99864770-491a-4f8e-8f3f-688436dc18ba") : failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.759806 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.759858 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.760653 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.780446 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.801162 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.820439 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.840443 5047 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.860888 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.881150 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.899937 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.921045 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.941699 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.960412 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.968983 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.969041 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.969103 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.969196 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.969258 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.969337 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:16 crc kubenswrapper[5047]: I0223 06:48:16.980257 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 
06:48:17.000834 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.020620 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.041668 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.061203 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.080498 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.101775 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.120294 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.140879 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.156479 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.160733 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.180671 5047 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.200976 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.220383 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.240023 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.261118 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.273112 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-cabundle\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.273214 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-key\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.273285 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/99864770-491a-4f8e-8f3f-688436dc18ba-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-22tgp\" (UID: \"99864770-491a-4f8e-8f3f-688436dc18ba\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.274737 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-cabundle\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.277842 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/99864770-491a-4f8e-8f3f-688436dc18ba-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-22tgp\" (UID: \"99864770-491a-4f8e-8f3f-688436dc18ba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.278749 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-signing-key\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.291521 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.320812 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.340469 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.361244 5047 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.380363 5047 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.400267 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.422194 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.440697 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.460669 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.481252 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.483142 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" event={"ID":"3b1235b4-4a1e-4507-88c6-387adfae3d5d","Type":"ContainerStarted","Data":"fb58145229c515f7d65f9ec62660e9bdb9726426bc95eb56b32748223f2fb8c4"} Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.515541 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9lr\" (UniqueName: \"kubernetes.io/projected/c20af121-a913-4178-aa60-35ca34fe91b2-kube-api-access-fz9lr\") pod \"dns-operator-744455d44c-w6ztx\" (UID: \"c20af121-a913-4178-aa60-35ca34fe91b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 
06:48:17.519155 5047 request.go:700] Waited for 1.877525833s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.535463 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7hm2\" (UniqueName: \"kubernetes.io/projected/8225b605-2014-4247-8d40-ce334502bb44-kube-api-access-r7hm2\") pod \"authentication-operator-69f744f599-7tlp5\" (UID: \"8225b605-2014-4247-8d40-ce334502bb44\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.542814 5047 projected.go:288] Couldn't get configMap openshift-route-controller-manager/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.542860 5047 projected.go:194] Error preparing data for projected volume kube-api-access-26trc for pod openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.542978 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc podName:d4967a59-3c6c-4ec3-9c70-4378ec3702c6 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.042944033 +0000 UTC m=+220.294271167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-26trc" (UniqueName: "kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc") pod "route-controller-manager-6576b87f9c-ct67m" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.555172 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkmvl\" (UniqueName: \"kubernetes.io/projected/cca77003-2334-4205-97ae-87f0ae6d34cc-kube-api-access-rkmvl\") pod \"machine-api-operator-5694c8668f-cfg96\" (UID: \"cca77003-2334-4205-97ae-87f0ae6d34cc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.576664 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wds\" (UniqueName: \"kubernetes.io/projected/cd20eb8d-5a96-407f-a898-1dad49ba8355-kube-api-access-j7wds\") pod \"console-f9d7485db-v5zcl\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.598037 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqm7\" (UniqueName: \"kubernetes.io/projected/ea8ee000-f48c-4d00-8602-cc8684f9f946-kube-api-access-xkqm7\") pod \"openshift-controller-manager-operator-756b6f6bc6-qxj5w\" (UID: \"ea8ee000-f48c-4d00-8602-cc8684f9f946\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.616248 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmhkb\" (UniqueName: \"kubernetes.io/projected/ac0d3a9f-d965-453b-8dcc-5cb638715ffa-kube-api-access-bmhkb\") pod \"console-operator-58897d9998-dhcsk\" (UID: \"ac0d3a9f-d965-453b-8dcc-5cb638715ffa\") " 
pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.635189 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd61b952-18b4-490f-97cb-dd2938aa9d22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.657365 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsf94\" (UniqueName: \"kubernetes.io/projected/b4a2bba0-ebb9-4923-a829-69112ef9c89c-kube-api-access-gsf94\") pod \"downloads-7954f5f757-75wvb\" (UID: \"b4a2bba0-ebb9-4923-a829-69112ef9c89c\") " pod="openshift-console/downloads-7954f5f757-75wvb" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.668120 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.686178 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv876\" (UniqueName: \"kubernetes.io/projected/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-kube-api-access-pv876\") pod \"oauth-openshift-558db77b4-ttb4m\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.700367 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5sn\" (UniqueName: \"kubernetes.io/projected/aa630c01-a2ea-4988-b0db-dc5ae558c1e1-kube-api-access-sd5sn\") pod \"openshift-config-operator-7777fb866f-csx28\" (UID: \"aa630c01-a2ea-4988-b0db-dc5ae558c1e1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.708354 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-75wvb" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.715148 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.720069 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2mln\" (UniqueName: \"kubernetes.io/projected/cd61b952-18b4-490f-97cb-dd2938aa9d22-kube-api-access-r2mln\") pod \"cluster-image-registry-operator-dc59b4c8b-rcb6l\" (UID: \"cd61b952-18b4-490f-97cb-dd2938aa9d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.720267 5047 projected.go:288] Couldn't get configMap openshift-controller-manager/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.720306 5047 projected.go:194] Error preparing data for projected volume kube-api-access-qsltq for pod openshift-controller-manager/controller-manager-879f6c89f-cm66d: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.720505 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq podName:71636caa-95b7-4d64-a6f1-1cdf7dc03c07 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.220372893 +0000 UTC m=+220.471700107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qsltq" (UniqueName: "kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq") pod "controller-manager-879f6c89f-cm66d" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.724221 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.730282 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.735568 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk5gl\" (UniqueName: \"kubernetes.io/projected/ac96f0e5-3605-48c8-80b4-ef4e02123af8-kube-api-access-jk5gl\") pod \"apiserver-76f77b778f-9l6zf\" (UID: \"ac96f0e5-3605-48c8-80b4-ef4e02123af8\") " pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.738167 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.747262 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.773760 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.777022 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbvw2\" (UniqueName: \"kubernetes.io/projected/f81c95a2-b542-4fde-be6c-9ac1e59f5ea1-kube-api-access-rbvw2\") pod \"machine-approver-56656f9798-6vkr2\" (UID: \"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.777617 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzkqx\" (UniqueName: \"kubernetes.io/projected/b448908d-5bb0-437e-a06c-d608dd395160-kube-api-access-dzkqx\") pod \"etcd-operator-b45778765-qgn89\" (UID: \"b448908d-5bb0-437e-a06c-d608dd395160\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.784032 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.805479 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65bc34e5-539d-4d00-8562-f8b02a455c4d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25gfm\" (UID: \"65bc34e5-539d-4d00-8562-f8b02a455c4d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.825242 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxw6v\" (UniqueName: \"kubernetes.io/projected/016925fa-5297-45db-8aa9-3f0310eb573f-kube-api-access-nxw6v\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.844358 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4pkd\" (UniqueName: \"kubernetes.io/projected/3fee8332-3258-4cf5-a2c6-f1d0c1c684f0-kube-api-access-l4pkd\") pod \"service-ca-9c57cc56f-rhztb\" (UID: \"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0\") " pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.860749 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7qj2\" (UniqueName: \"kubernetes.io/projected/4cc1838c-dcb5-4c77-9d34-91507d631e3d-kube-api-access-s7qj2\") pod \"kube-storage-version-migrator-operator-b67b599dd-952vj\" (UID: \"4cc1838c-dcb5-4c77-9d34-91507d631e3d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.877067 5047 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console-operator/console-operator-58897d9998-dhcsk"] Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.878475 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfjcc\" (UniqueName: \"kubernetes.io/projected/99864770-491a-4f8e-8f3f-688436dc18ba-kube-api-access-pfjcc\") pod \"control-plane-machine-set-operator-78cbb6b69f-22tgp\" (UID: \"99864770-491a-4f8e-8f3f-688436dc18ba\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.895438 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/016925fa-5297-45db-8aa9-3f0310eb573f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4b54r\" (UID: \"016925fa-5297-45db-8aa9-3f0310eb573f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.921388 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.933081 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.933236 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.941876 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.958916 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.960565 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970462 5047 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970499 5047 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970514 5047 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970469 5047 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970583 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert podName:d4967a59-3c6c-4ec3-9c70-4378ec3702c6 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.970547852 +0000 UTC m=+221.221874986 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert") pod "route-controller-manager-6576b87f9c-ct67m" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6") : failed to sync secret cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970604 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config podName:71636caa-95b7-4d64-a6f1-1cdf7dc03c07 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.970595834 +0000 UTC m=+221.221922968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config") pod "controller-manager-879f6c89f-cm66d" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970622 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca podName:71636caa-95b7-4d64-a6f1-1cdf7dc03c07 nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.970613364 +0000 UTC m=+221.221940718 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca") pod "controller-manager-879f6c89f-cm66d" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: E0223 06:48:17.970637 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles podName:71636caa-95b7-4d64-a6f1-1cdf7dc03c07 nodeName:}" failed. 
No retries permitted until 2026-02-23 06:48:18.970630325 +0000 UTC m=+221.221957459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles") pod "controller-manager-879f6c89f-cm66d" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.972374 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.980077 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" Feb 23 06:48:17 crc kubenswrapper[5047]: I0223 06:48:17.980612 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:17.999879 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.017515 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.020943 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.036314 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.042300 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.065318 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.080400 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.083350 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.084656 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26trc\" (UniqueName: \"kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.096980 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.104767 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.121767 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.122714 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.131477 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26trc\" (UniqueName: \"kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.142175 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.158019 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.187396 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.187445 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-certificates\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188301 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-serving-cert\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188522 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk8w5\" (UniqueName: \"kubernetes.io/projected/0184e0f6-46c5-4a13-967f-ea628e1478b6-kube-api-access-wk8w5\") pod \"ingress-canary-f8298\" (UID: \"0184e0f6-46c5-4a13-967f-ea628e1478b6\") " pod="openshift-ingress-canary/ingress-canary-f8298" Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.188592 5047 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.688565579 +0000 UTC m=+220.939892713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188643 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-trusted-ca\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188708 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdxvf\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-kube-api-access-jdxvf\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188772 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 
06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188800 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59v2n\" (UniqueName: \"kubernetes.io/projected/af373c51-80d6-4656-9b92-800ebb6244b5-kube-api-access-59v2n\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188828 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d790e45e-5411-4acb-8875-23c6395250c8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188864 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnjqp\" (UniqueName: \"kubernetes.io/projected/ef86187f-af31-4959-8d1c-fb954bed58d0-kube-api-access-mnjqp\") pod \"migrator-59844c95c7-hh4ql\" (UID: \"ef86187f-af31-4959-8d1c-fb954bed58d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188890 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67nf7\" (UniqueName: \"kubernetes.io/projected/37497943-34b6-4c1b-988e-c7b07e9bf608-kube-api-access-67nf7\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188954 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhzfq\" (UniqueName: 
\"kubernetes.io/projected/601765d3-2ae0-4cd2-a1fb-2c54de37487b-kube-api-access-qhzfq\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.188992 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/37497943-34b6-4c1b-988e-c7b07e9bf608-srv-cert\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189011 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02886198-ff3b-442c-8508-c29fa5dbd216-images\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189046 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-encryption-config\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189195 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-tmpfs\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 
06:48:18.189229 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189259 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a75720bd-50a7-4a3c-b12c-e901126d4382-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189287 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-etcd-client\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189603 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189656 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w6ztx"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189816 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-webhook-cert\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.189879 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdc8q\" (UniqueName: \"kubernetes.io/projected/efef4bf3-fb34-4cca-b99e-a00249f38e11-kube-api-access-rdc8q\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190021 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ssfv\" (UniqueName: \"kubernetes.io/projected/1301a9f3-55a0-4a5d-b851-e764b459297e-kube-api-access-5ssfv\") pod \"package-server-manager-789f6589d5-lxddc\" (UID: \"1301a9f3-55a0-4a5d-b851-e764b459297e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190094 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-service-ca-bundle\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190116 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-secret-volume\") pod 
\"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190138 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0184e0f6-46c5-4a13-967f-ea628e1478b6-cert\") pod \"ingress-canary-f8298\" (UID: \"0184e0f6-46c5-4a13-967f-ea628e1478b6\") " pod="openshift-ingress-canary/ingress-canary-f8298" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190175 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-metrics-certs\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190217 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1301a9f3-55a0-4a5d-b851-e764b459297e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lxddc\" (UID: \"1301a9f3-55a0-4a5d-b851-e764b459297e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190237 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190361 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02886198-ff3b-442c-8508-c29fa5dbd216-proxy-tls\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190401 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02886198-ff3b-442c-8508-c29fa5dbd216-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190424 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-config\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190527 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a75720bd-50a7-4a3c-b12c-e901126d4382-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190554 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.190572 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhd28\" (UniqueName: \"kubernetes.io/projected/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-kube-api-access-hhd28\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.191229 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-bound-sa-token\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.191468 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/37497943-34b6-4c1b-988e-c7b07e9bf608-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.191504 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-stats-auth\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.191527 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/efef4bf3-fb34-4cca-b99e-a00249f38e11-proxy-tls\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.191610 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjg8j\" (UniqueName: \"kubernetes.io/projected/1379a9ff-16c4-4717-b9d8-59ea36808a48-kube-api-access-fjg8j\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.191739 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d790e45e-5411-4acb-8875-23c6395250c8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.191970 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88v7n\" (UniqueName: \"kubernetes.io/projected/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-kube-api-access-88v7n\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192027 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-serving-cert\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192240 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jj72\" (UniqueName: \"kubernetes.io/projected/1d363fb9-3892-412c-a2bd-ba00a377c2eb-kube-api-access-9jj72\") pod \"multus-admission-controller-857f4d67dd-b9dcg\" (UID: \"1d363fb9-3892-412c-a2bd-ba00a377c2eb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192462 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192508 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hdh\" (UniqueName: \"kubernetes.io/projected/02886198-ff3b-442c-8508-c29fa5dbd216-kube-api-access-f2hdh\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192614 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1379a9ff-16c4-4717-b9d8-59ea36808a48-srv-cert\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192644 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj6dp\" (UniqueName: \"kubernetes.io/projected/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-kube-api-access-dj6dp\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192698 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-default-certificate\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192730 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1d363fb9-3892-412c-a2bd-ba00a377c2eb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b9dcg\" (UID: \"1d363fb9-3892-412c-a2bd-ba00a377c2eb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192755 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-apiservice-cert\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192777 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af373c51-80d6-4656-9b92-800ebb6244b5-audit-dir\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192826 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-tls\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.192855 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-config\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.193510 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1379a9ff-16c4-4717-b9d8-59ea36808a48-profile-collector-cert\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.197192 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-audit-policies\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 
06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.197279 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d790e45e-5411-4acb-8875-23c6395250c8-config\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.197366 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9vjl\" (UniqueName: \"kubernetes.io/projected/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-kube-api-access-n9vjl\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.197580 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efef4bf3-fb34-4cca-b99e-a00249f38e11-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.197624 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-config-volume\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.200828 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-75wvb"] Feb 23 06:48:18 crc 
kubenswrapper[5047]: I0223 06:48:18.286255 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qgn89"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.298850 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.299067 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.799030103 +0000 UTC m=+221.050357237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299142 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c41c983d-45b3-45e8-9562-70d269d414ff-metrics-tls\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299179 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-plugins-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299245 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-audit-policies\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299277 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d790e45e-5411-4acb-8875-23c6395250c8-config\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299306 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9vjl\" (UniqueName: \"kubernetes.io/projected/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-kube-api-access-n9vjl\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299340 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efef4bf3-fb34-4cca-b99e-a00249f38e11-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299368 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-config-volume\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299388 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsltq\" (UniqueName: \"kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299409 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299428 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-certificates\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299488 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-serving-cert\") pod \"service-ca-operator-777779d784-27bdk\" (UID: 
\"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299529 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk8w5\" (UniqueName: \"kubernetes.io/projected/0184e0f6-46c5-4a13-967f-ea628e1478b6-kube-api-access-wk8w5\") pod \"ingress-canary-f8298\" (UID: \"0184e0f6-46c5-4a13-967f-ea628e1478b6\") " pod="openshift-ingress-canary/ingress-canary-f8298" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299558 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-trusted-ca\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299580 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdxvf\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-kube-api-access-jdxvf\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299601 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299620 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59v2n\" (UniqueName: 
\"kubernetes.io/projected/af373c51-80d6-4656-9b92-800ebb6244b5-kube-api-access-59v2n\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299641 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d790e45e-5411-4acb-8875-23c6395250c8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299687 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnjqp\" (UniqueName: \"kubernetes.io/projected/ef86187f-af31-4959-8d1c-fb954bed58d0-kube-api-access-mnjqp\") pod \"migrator-59844c95c7-hh4ql\" (UID: \"ef86187f-af31-4959-8d1c-fb954bed58d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299714 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c41c983d-45b3-45e8-9562-70d269d414ff-config-volume\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299759 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67nf7\" (UniqueName: \"kubernetes.io/projected/37497943-34b6-4c1b-988e-c7b07e9bf608-kube-api-access-67nf7\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299794 
5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhzfq\" (UniqueName: \"kubernetes.io/projected/601765d3-2ae0-4cd2-a1fb-2c54de37487b-kube-api-access-qhzfq\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299823 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/37497943-34b6-4c1b-988e-c7b07e9bf608-srv-cert\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299843 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02886198-ff3b-442c-8508-c29fa5dbd216-images\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299873 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-encryption-config\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299909 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-tmpfs\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 
23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.299957 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300003 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a75720bd-50a7-4a3c-b12c-e901126d4382-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300028 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-etcd-client\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300055 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ff54766c-b299-4eab-8a68-8d85552d861f-node-bootstrap-token\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300073 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300091 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-webhook-cert\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300109 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdc8q\" (UniqueName: \"kubernetes.io/projected/efef4bf3-fb34-4cca-b99e-a00249f38e11-kube-api-access-rdc8q\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300272 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ssfv\" (UniqueName: \"kubernetes.io/projected/1301a9f3-55a0-4a5d-b851-e764b459297e-kube-api-access-5ssfv\") pod \"package-server-manager-789f6589d5-lxddc\" (UID: \"1301a9f3-55a0-4a5d-b851-e764b459297e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300352 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-service-ca-bundle\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300380 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-secret-volume\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300448 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0184e0f6-46c5-4a13-967f-ea628e1478b6-cert\") pod \"ingress-canary-f8298\" (UID: \"0184e0f6-46c5-4a13-967f-ea628e1478b6\") " pod="openshift-ingress-canary/ingress-canary-f8298" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300504 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj9rt\" (UniqueName: \"kubernetes.io/projected/ff54766c-b299-4eab-8a68-8d85552d861f-kube-api-access-pj9rt\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300539 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-metrics-certs\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300588 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1301a9f3-55a0-4a5d-b851-e764b459297e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lxddc\" (UID: \"1301a9f3-55a0-4a5d-b851-e764b459297e\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300609 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300631 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntcvn\" (UniqueName: \"kubernetes.io/projected/b0031f21-2311-48e2-acb4-fc64475e66a2-kube-api-access-ntcvn\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300681 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02886198-ff3b-442c-8508-c29fa5dbd216-proxy-tls\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300701 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-csi-data-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300844 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/02886198-ff3b-442c-8508-c29fa5dbd216-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300870 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-config\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.300891 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a75720bd-50a7-4a3c-b12c-e901126d4382-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301027 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301071 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhd28\" (UniqueName: \"kubernetes.io/projected/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-kube-api-access-hhd28\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 
06:48:18.301118 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-bound-sa-token\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301163 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/37497943-34b6-4c1b-988e-c7b07e9bf608-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301180 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-stats-auth\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301200 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjg8j\" (UniqueName: \"kubernetes.io/projected/1379a9ff-16c4-4717-b9d8-59ea36808a48-kube-api-access-fjg8j\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301219 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/efef4bf3-fb34-4cca-b99e-a00249f38e11-proxy-tls\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301235 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d790e45e-5411-4acb-8875-23c6395250c8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301265 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88v7n\" (UniqueName: \"kubernetes.io/projected/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-kube-api-access-88v7n\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301283 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-serving-cert\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301308 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvdcn\" (UniqueName: \"kubernetes.io/projected/c41c983d-45b3-45e8-9562-70d269d414ff-kube-api-access-jvdcn\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301333 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jj72\" (UniqueName: 
\"kubernetes.io/projected/1d363fb9-3892-412c-a2bd-ba00a377c2eb-kube-api-access-9jj72\") pod \"multus-admission-controller-857f4d67dd-b9dcg\" (UID: \"1d363fb9-3892-412c-a2bd-ba00a377c2eb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301350 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ff54766c-b299-4eab-8a68-8d85552d861f-certs\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301399 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301417 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hdh\" (UniqueName: \"kubernetes.io/projected/02886198-ff3b-442c-8508-c29fa5dbd216-kube-api-access-f2hdh\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301433 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1379a9ff-16c4-4717-b9d8-59ea36808a48-srv-cert\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc 
kubenswrapper[5047]: I0223 06:48:18.301467 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj6dp\" (UniqueName: \"kubernetes.io/projected/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-kube-api-access-dj6dp\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301485 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-default-certificate\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301502 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-registration-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301520 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1d363fb9-3892-412c-a2bd-ba00a377c2eb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b9dcg\" (UID: \"1d363fb9-3892-412c-a2bd-ba00a377c2eb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301538 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-apiservice-cert\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: 
\"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301554 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af373c51-80d6-4656-9b92-800ebb6244b5-audit-dir\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301569 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-mountpoint-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301587 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-tls\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301602 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-config\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301618 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-socket-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.301637 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1379a9ff-16c4-4717-b9d8-59ea36808a48-profile-collector-cert\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.302230 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-audit-policies\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.303484 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d790e45e-5411-4acb-8875-23c6395250c8-config\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.304093 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.304444 5047 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-certificates\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.305719 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efef4bf3-fb34-4cca-b99e-a00249f38e11-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.306681 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-config-volume\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.312836 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-serving-cert\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.313731 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-config\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc 
kubenswrapper[5047]: I0223 06:48:18.318069 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-service-ca-bundle\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.319024 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af373c51-80d6-4656-9b92-800ebb6244b5-audit-dir\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.319071 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af373c51-80d6-4656-9b92-800ebb6244b5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.319721 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.819702376 +0000 UTC m=+221.071029510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.325067 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-config\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.325420 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d790e45e-5411-4acb-8875-23c6395250c8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.325590 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-tmpfs\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.326271 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02886198-ff3b-442c-8508-c29fa5dbd216-images\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: 
\"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.326280 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.327286 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-trusted-ca\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.333076 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1d363fb9-3892-412c-a2bd-ba00a377c2eb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-b9dcg\" (UID: \"1d363fb9-3892-412c-a2bd-ba00a377c2eb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.333365 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-serving-cert\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.334127 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02886198-ff3b-442c-8508-c29fa5dbd216-auth-proxy-config\") 
pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.336719 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1379a9ff-16c4-4717-b9d8-59ea36808a48-profile-collector-cert\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.338873 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cfg96"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.341092 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/efef4bf3-fb34-4cca-b99e-a00249f38e11-proxy-tls\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.342469 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a75720bd-50a7-4a3c-b12c-e901126d4382-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.343316 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-default-certificate\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " 
pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.343437 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-webhook-cert\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.343784 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-encryption-config\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.344400 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.344871 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1301a9f3-55a0-4a5d-b851-e764b459297e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lxddc\" (UID: \"1301a9f3-55a0-4a5d-b851-e764b459297e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.345006 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-serving-cert\") 
pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.345324 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-stats-auth\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.345616 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af373c51-80d6-4656-9b92-800ebb6244b5-etcd-client\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.348759 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-secret-volume\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.351847 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1379a9ff-16c4-4717-b9d8-59ea36808a48-srv-cert\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.352478 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/a75720bd-50a7-4a3c-b12c-e901126d4382-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.355385 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/37497943-34b6-4c1b-988e-c7b07e9bf608-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.357478 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdxvf\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-kube-api-access-jdxvf\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.359440 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-metrics-certs\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.360777 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0184e0f6-46c5-4a13-967f-ea628e1478b6-cert\") pod \"ingress-canary-f8298\" (UID: \"0184e0f6-46c5-4a13-967f-ea628e1478b6\") " pod="openshift-ingress-canary/ingress-canary-f8298" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.363964 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/02886198-ff3b-442c-8508-c29fa5dbd216-proxy-tls\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.364025 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsltq\" (UniqueName: \"kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.364312 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-tls\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.364503 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/37497943-34b6-4c1b-988e-c7b07e9bf608-srv-cert\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.365235 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-apiservice-cert\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.374846 5047 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-59v2n\" (UniqueName: \"kubernetes.io/projected/af373c51-80d6-4656-9b92-800ebb6244b5-kube-api-access-59v2n\") pod \"apiserver-7bbb656c7d-m5fb6\" (UID: \"af373c51-80d6-4656-9b92-800ebb6244b5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.375138 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9vjl\" (UniqueName: \"kubernetes.io/projected/07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e-kube-api-access-n9vjl\") pod \"router-default-5444994796-g5zp5\" (UID: \"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e\") " pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.382772 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.382819 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v5zcl"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.383282 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj6dp\" (UniqueName: \"kubernetes.io/projected/cb3f2abe-d31e-4ece-aae1-abe99f0b1425-kube-api-access-dj6dp\") pod \"packageserver-d55dfcdfc-nvbsc\" (UID: \"cb3f2abe-d31e-4ece-aae1-abe99f0b1425\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.386138 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ttb4m"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.402573 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.402792 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c41c983d-45b3-45e8-9562-70d269d414ff-config-volume\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.402850 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ff54766c-b299-4eab-8a68-8d85552d861f-node-bootstrap-token\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.402893 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj9rt\" (UniqueName: \"kubernetes.io/projected/ff54766c-b299-4eab-8a68-8d85552d861f-kube-api-access-pj9rt\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.402939 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntcvn\" (UniqueName: \"kubernetes.io/projected/b0031f21-2311-48e2-acb4-fc64475e66a2-kube-api-access-ntcvn\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.402959 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-csi-data-dir\") pod \"csi-hostpathplugin-m247m\" 
(UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403021 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvdcn\" (UniqueName: \"kubernetes.io/projected/c41c983d-45b3-45e8-9562-70d269d414ff-kube-api-access-jvdcn\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403043 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ff54766c-b299-4eab-8a68-8d85552d861f-certs\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403077 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-registration-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403095 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-mountpoint-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403114 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-socket-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " 
pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403134 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c41c983d-45b3-45e8-9562-70d269d414ff-metrics-tls\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403151 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-plugins-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403387 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-csi-data-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.403485 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-plugins-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.403547 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:18.903514311 +0000 UTC m=+221.154841515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.404446 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c41c983d-45b3-45e8-9562-70d269d414ff-config-volume\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.406584 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-mountpoint-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.406665 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-registration-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.406881 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b0031f21-2311-48e2-acb4-fc64475e66a2-socket-dir\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: 
I0223 06:48:18.409976 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.413227 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hdh\" (UniqueName: \"kubernetes.io/projected/02886198-ff3b-442c-8508-c29fa5dbd216-kube-api-access-f2hdh\") pod \"machine-config-operator-74547568cd-bwkb2\" (UID: \"02886198-ff3b-442c-8508-c29fa5dbd216\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: W0223 06:48:18.414567 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd61b952_18b4_490f_97cb_dd2938aa9d22.slice/crio-b100de53cb18a5fa44c03042578334c7ac701b11200a15651716eca683ac865b WatchSource:0}: Error finding container b100de53cb18a5fa44c03042578334c7ac701b11200a15651716eca683ac865b: Status 404 returned error can't find the container with id b100de53cb18a5fa44c03042578334c7ac701b11200a15651716eca683ac865b Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.417475 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c41c983d-45b3-45e8-9562-70d269d414ff-metrics-tls\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.425941 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ff54766c-b299-4eab-8a68-8d85552d861f-node-bootstrap-token\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.427936 5047 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7tlp5"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.429100 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ff54766c-b299-4eab-8a68-8d85552d861f-certs\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.430087 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.441620 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-bound-sa-token\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.455856 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88v7n\" (UniqueName: \"kubernetes.io/projected/4a1e6f5f-7b32-4f6f-bc21-bc520e469640-kube-api-access-88v7n\") pod \"service-ca-operator-777779d784-27bdk\" (UID: \"4a1e6f5f-7b32-4f6f-bc21-bc520e469640\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.474259 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjg8j\" (UniqueName: \"kubernetes.io/projected/1379a9ff-16c4-4717-b9d8-59ea36808a48-kube-api-access-fjg8j\") pod \"catalog-operator-68c6474976-fsgwn\" (UID: \"1379a9ff-16c4-4717-b9d8-59ea36808a48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc 
kubenswrapper[5047]: I0223 06:48:18.492893 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.497143 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" event={"ID":"c20af121-a913-4178-aa60-35ca34fe91b2","Type":"ContainerStarted","Data":"19d69791f34b067dcb37ecdaf97011d28970fdc73d050b6fa67d1467034e190c"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.500094 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnjqp\" (UniqueName: \"kubernetes.io/projected/ef86187f-af31-4959-8d1c-fb954bed58d0-kube-api-access-mnjqp\") pod \"migrator-59844c95c7-hh4ql\" (UID: \"ef86187f-af31-4959-8d1c-fb954bed58d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.502186 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" event={"ID":"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1","Type":"ContainerStarted","Data":"e2f137fce423a9c881d62aab3a7e8ccd875c3933cfe5fbaeb07b4b9a06fea1c9"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.502247 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" event={"ID":"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1","Type":"ContainerStarted","Data":"1f448ce5f8af479cadc216d59ec65d0349283a8cd25652bc736321dd9eda487e"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.504728 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" event={"ID":"b448908d-5bb0-437e-a06c-d608dd395160","Type":"ContainerStarted","Data":"3b6dd483bec84adcb224640a90f1ec4acdec67d6ecf155380bb6b26bac0eb39a"} Feb 23 06:48:18 crc 
kubenswrapper[5047]: I0223 06:48:18.505096 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.505462 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.005444617 +0000 UTC m=+221.256771751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.511521 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" event={"ID":"d1fdf697-2344-4656-8ae3-8f516f5dd1ca","Type":"ContainerStarted","Data":"e84a6843d8e4fce6afd1f147e36e876bfe99e13c535b84917e84cbd66a1ac988"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.517837 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" event={"ID":"cca77003-2334-4205-97ae-87f0ae6d34cc","Type":"ContainerStarted","Data":"a72213a5694d01e647318929702ce573a9884fda2a0dcf9cde80a193b0a46b6a"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.519910 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67nf7\" (UniqueName: \"kubernetes.io/projected/37497943-34b6-4c1b-988e-c7b07e9bf608-kube-api-access-67nf7\") pod \"olm-operator-6b444d44fb-z7flg\" (UID: \"37497943-34b6-4c1b-988e-c7b07e9bf608\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.522031 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" event={"ID":"8225b605-2014-4247-8d40-ce334502bb44","Type":"ContainerStarted","Data":"4b0807ebae43a246c7d77cb7a7513ca70c29b1afbe2982dff2defe5c5d1859aa"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.525283 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" event={"ID":"cd61b952-18b4-490f-97cb-dd2938aa9d22","Type":"ContainerStarted","Data":"b100de53cb18a5fa44c03042578334c7ac701b11200a15651716eca683ac865b"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.526852 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5zcl" event={"ID":"cd20eb8d-5a96-407f-a898-1dad49ba8355","Type":"ContainerStarted","Data":"b2e733971ff89866db652f10e7fe441bf66de1b155e2ad4e384599063be9625c"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.544314 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhzfq\" (UniqueName: \"kubernetes.io/projected/601765d3-2ae0-4cd2-a1fb-2c54de37487b-kube-api-access-qhzfq\") pod \"marketplace-operator-79b997595-fdfm4\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.554717 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.560395 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" event={"ID":"ea8ee000-f48c-4d00-8602-cc8684f9f946","Type":"ContainerStarted","Data":"4ecfb0504dd72a830fdffbd4b291ee80cd14bef33e0de13983d36891d715392e"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.560825 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhd28\" (UniqueName: \"kubernetes.io/projected/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-kube-api-access-hhd28\") pod \"collect-profiles-29530485-jxmq4\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.569684 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.570470 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-75wvb" event={"ID":"b4a2bba0-ebb9-4923-a829-69112ef9c89c","Type":"ContainerStarted","Data":"12ad3fad486c6cb8c720043f92dcc5eefc47e46ebb241bf5cb5c1e82982b008e"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.570822 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-75wvb" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.579893 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdc8q\" (UniqueName: \"kubernetes.io/projected/efef4bf3-fb34-4cca-b99e-a00249f38e11-kube-api-access-rdc8q\") pod \"machine-config-controller-84d6567774-t45d6\" (UID: \"efef4bf3-fb34-4cca-b99e-a00249f38e11\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.597323 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" event={"ID":"ac0d3a9f-d965-453b-8dcc-5cb638715ffa","Type":"ContainerStarted","Data":"7531a7e7d65b0ddee78029cc4c3756b757a18e4885ed71380f1373d9a6bf9d34"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.597374 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" event={"ID":"ac0d3a9f-d965-453b-8dcc-5cb638715ffa","Type":"ContainerStarted","Data":"45df9990192b1f6b3621d337188c1b6f387972148f749f2fe6bc3620ca160a88"} Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.597846 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.600220 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d790e45e-5411-4acb-8875-23c6395250c8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4mdvr\" (UID: \"d790e45e-5411-4acb-8875-23c6395250c8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.601114 5047 patch_prober.go:28] interesting pod/console-operator-58897d9998-dhcsk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.601177 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" podUID="ac0d3a9f-d965-453b-8dcc-5cb638715ffa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.605171 5047 patch_prober.go:28] interesting pod/downloads-7954f5f757-75wvb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.605243 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-75wvb" podUID="b4a2bba0-ebb9-4923-a829-69112ef9c89c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.606573 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.608449 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.108419354 +0000 UTC m=+221.359746488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.616931 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.631582 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9l6zf"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.642953 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jj72\" (UniqueName: \"kubernetes.io/projected/1d363fb9-3892-412c-a2bd-ba00a377c2eb-kube-api-access-9jj72\") pod \"multus-admission-controller-857f4d67dd-b9dcg\" (UID: \"1d363fb9-3892-412c-a2bd-ba00a377c2eb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.646332 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.658458 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-csx28"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.662466 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.673412 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e98b1fd-d7b2-4af4-92c3-f6b4df04df73-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7x976\" (UID: \"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.685232 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.686717 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ssfv\" (UniqueName: \"kubernetes.io/projected/1301a9f3-55a0-4a5d-b851-e764b459297e-kube-api-access-5ssfv\") pod \"package-server-manager-789f6589d5-lxddc\" (UID: \"1301a9f3-55a0-4a5d-b851-e764b459297e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.706736 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk8w5\" (UniqueName: \"kubernetes.io/projected/0184e0f6-46c5-4a13-967f-ea628e1478b6-kube-api-access-wk8w5\") pod \"ingress-canary-f8298\" (UID: \"0184e0f6-46c5-4a13-967f-ea628e1478b6\") " pod="openshift-ingress-canary/ingress-canary-f8298" Feb 23 06:48:18 crc kubenswrapper[5047]: W0223 06:48:18.707385 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cc1838c_dcb5_4c77_9d34_91507d631e3d.slice/crio-c30ac18fcad533da2a817eca8e7b4a32347fa74ee4ce04f24c21cedb3d7db362 WatchSource:0}: Error finding container 
c30ac18fcad533da2a817eca8e7b4a32347fa74ee4ce04f24c21cedb3d7db362: Status 404 returned error can't find the container with id c30ac18fcad533da2a817eca8e7b4a32347fa74ee4ce04f24c21cedb3d7db362 Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.707886 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.710078 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.210050012 +0000 UTC m=+221.461377146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.718248 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvdcn\" (UniqueName: \"kubernetes.io/projected/c41c983d-45b3-45e8-9562-70d269d414ff-kube-api-access-jvdcn\") pod \"dns-default-p7tmt\" (UID: \"c41c983d-45b3-45e8-9562-70d269d414ff\") " pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.720101 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rhztb"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.737154 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.737478 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p7tmt" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.737960 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntcvn\" (UniqueName: \"kubernetes.io/projected/b0031f21-2311-48e2-acb4-fc64475e66a2-kube-api-access-ntcvn\") pod \"csi-hostpathplugin-m247m\" (UID: \"b0031f21-2311-48e2-acb4-fc64475e66a2\") " pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.738208 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:18 crc kubenswrapper[5047]: W0223 06:48:18.748393 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa630c01_a2ea_4988_b0db_dc5ae558c1e1.slice/crio-8385b3041fea4cb6da694936da3f63296c81e3e3667c6d8fd2de8eab7d3858cc WatchSource:0}: Error finding container 8385b3041fea4cb6da694936da3f63296c81e3e3667c6d8fd2de8eab7d3858cc: Status 404 returned error can't find the container with id 8385b3041fea4cb6da694936da3f63296c81e3e3667c6d8fd2de8eab7d3858cc Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.750405 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.752631 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj9rt\" (UniqueName: \"kubernetes.io/projected/ff54766c-b299-4eab-8a68-8d85552d861f-kube-api-access-pj9rt\") pod \"machine-config-server-tfqhr\" (UID: \"ff54766c-b299-4eab-8a68-8d85552d861f\") " pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.778403 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.784116 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.801278 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm"] Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.820581 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.822431 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.823780 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.323067237 +0000 UTC m=+221.574394361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.828853 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.842441 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.864167 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.881078 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.926742 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:18 crc kubenswrapper[5047]: E0223 06:48:18.927186 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.427166244 +0000 UTC m=+221.678493378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:18 crc kubenswrapper[5047]: I0223 06:48:18.995974 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f8298" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.011862 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tfqhr" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.028065 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.028309 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.028381 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.028417 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.028457 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca\") 
pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.029437 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.029542 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.529525414 +0000 UTC m=+221.780852548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.030522 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.031621 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.032341 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m247m" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.034223 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cm66d\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.051237 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert\") pod \"route-controller-manager-6576b87f9c-ct67m\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.067544 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.132340 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.135351 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-02-23 06:48:19.635329397 +0000 UTC m=+221.886656541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: W0223 06:48:19.184212 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1379a9ff_16c4_4717_b9d8_59ea36808a48.slice/crio-f722f631b8b7d8f64be33cd2a227a5dae6e6490b147a4ee6303ed38c9ea26ac6 WatchSource:0}: Error finding container f722f631b8b7d8f64be33cd2a227a5dae6e6490b147a4ee6303ed38c9ea26ac6: Status 404 returned error can't find the container with id f722f631b8b7d8f64be33cd2a227a5dae6e6490b147a4ee6303ed38c9ea26ac6 Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.233466 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.233670 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.733634964 +0000 UTC m=+221.984962098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.234157 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.234569 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.73455406 +0000 UTC m=+221.985881194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.238587 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.291685 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.315301 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-27bdk"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.319370 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.335461 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.335811 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.835787188 +0000 UTC m=+222.087114322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: W0223 06:48:19.432301 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02886198_ff3b_442c_8508_c29fa5dbd216.slice/crio-ed2d295eabf42502a18a64ac219823cb555bcddcd2914b249e4979c958176128 WatchSource:0}: Error finding container ed2d295eabf42502a18a64ac219823cb555bcddcd2914b249e4979c958176128: Status 404 returned error can't find the container with id ed2d295eabf42502a18a64ac219823cb555bcddcd2914b249e4979c958176128 Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.437020 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.437482 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:19.937466068 +0000 UTC m=+222.188793202 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: W0223 06:48:19.440848 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a1e6f5f_7b32_4f6f_bc21_bc520e469640.slice/crio-37b8da190380b1a2c811b2ad50a0e0a9a4d60a8c03439d1015aa155508f1cc21 WatchSource:0}: Error finding container 37b8da190380b1a2c811b2ad50a0e0a9a4d60a8c03439d1015aa155508f1cc21: Status 404 returned error can't find the container with id 37b8da190380b1a2c811b2ad50a0e0a9a4d60a8c03439d1015aa155508f1cc21 Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.478858 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.530343 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wc7s4" podStartSLOduration=150.530314693 podStartE2EDuration="2m30.530314693s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:19.530054275 +0000 UTC m=+221.781381409" watchObservedRunningTime="2026-02-23 06:48:19.530314693 +0000 UTC m=+221.781641827" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.538808 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.539217 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.039195189 +0000 UTC m=+222.290522313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.547825 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.564337 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-75wvb" podStartSLOduration=150.564313285 podStartE2EDuration="2m30.564313285s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:19.558545465 +0000 UTC m=+221.809872599" watchObservedRunningTime="2026-02-23 06:48:19.564313285 +0000 UTC m=+221.815640419" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.566159 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.604076 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" podStartSLOduration=150.603883754 podStartE2EDuration="2m30.603883754s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:19.595907612 +0000 UTC m=+221.847234746" watchObservedRunningTime="2026-02-23 06:48:19.603883754 +0000 UTC m=+221.855210888" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.621714 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" event={"ID":"4a1e6f5f-7b32-4f6f-bc21-bc520e469640","Type":"ContainerStarted","Data":"37b8da190380b1a2c811b2ad50a0e0a9a4d60a8c03439d1015aa155508f1cc21"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.642554 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.643346 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.143304447 +0000 UTC m=+222.394631581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.657328 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" event={"ID":"cd61b952-18b4-490f-97cb-dd2938aa9d22","Type":"ContainerStarted","Data":"cd7940a1d494a1e98cdc5142c868fbd05dc99122e0d3bd8439c52f6aedceae94"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.663853 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-b9dcg"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.664894 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.677160 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5zcl" event={"ID":"cd20eb8d-5a96-407f-a898-1dad49ba8355","Type":"ContainerStarted","Data":"3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.680494 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" event={"ID":"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0","Type":"ContainerStarted","Data":"d1ab1dd446e4663eed164b185cf7fc32b7a6f9aabe08296467bfcb45e05c77d9"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.683635 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" event={"ID":"65bc34e5-539d-4d00-8562-f8b02a455c4d","Type":"ContainerStarted","Data":"26ce92409e8b8b1085d9658299d5146af05665c0f9e705f4ddd656548ed93fc8"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.690135 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g5zp5" event={"ID":"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e","Type":"ContainerStarted","Data":"8267d4778c2e49f1e7f02eca43f62efb4f3cb8b81aa7487802ac1b43172c4609"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.690390 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g5zp5" event={"ID":"07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e","Type":"ContainerStarted","Data":"7c4ef71f010e6f3991ba83a8be0a8f5b3eb46f622cf1ab0375e07c7cc98ef8cf"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.696708 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" event={"ID":"8225b605-2014-4247-8d40-ce334502bb44","Type":"ContainerStarted","Data":"ab480d0663a2d318f0e124128a555526e85af86eaac947167bfe4bbe848b4cdc"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.699161 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" event={"ID":"1379a9ff-16c4-4717-b9d8-59ea36808a48","Type":"ContainerStarted","Data":"f722f631b8b7d8f64be33cd2a227a5dae6e6490b147a4ee6303ed38c9ea26ac6"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.702279 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" event={"ID":"f81c95a2-b542-4fde-be6c-9ac1e59f5ea1","Type":"ContainerStarted","Data":"4153f78c99863f15e0c5de28d8f4bf3859e469d3d7e95564af8ae3040789a1a0"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 
06:48:19.705578 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" event={"ID":"c20af121-a913-4178-aa60-35ca34fe91b2","Type":"ContainerStarted","Data":"92f2ec80cb5ed23c9b08ed276fdc4ee4c7a3dfda234c309dcf768ef1c58ab09d"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.707067 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" event={"ID":"aa630c01-a2ea-4988-b0db-dc5ae558c1e1","Type":"ContainerStarted","Data":"8385b3041fea4cb6da694936da3f63296c81e3e3667c6d8fd2de8eab7d3858cc"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.709405 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" event={"ID":"ea8ee000-f48c-4d00-8602-cc8684f9f946","Type":"ContainerStarted","Data":"0fe56743f9e4ab04263f929ed15d22d9842d724a63ce990fbb7ea4c5f5e33244"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.721563 5047 patch_prober.go:28] interesting pod/downloads-7954f5f757-75wvb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.721644 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-75wvb" podUID="b4a2bba0-ebb9-4923-a829-69112ef9c89c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.714954 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" 
event={"ID":"cb3f2abe-d31e-4ece-aae1-abe99f0b1425","Type":"ContainerStarted","Data":"94d0c21b2050ba9a2260ca27dda56facf3902ea41e38d0060e325d3884540813"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.729100 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-75wvb" event={"ID":"b4a2bba0-ebb9-4923-a829-69112ef9c89c","Type":"ContainerStarted","Data":"399acf878900a4cbb51d7c4a7cbb6133ce7c07da9bd65b3f955224c406183f94"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.729125 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" event={"ID":"02886198-ff3b-442c-8508-c29fa5dbd216","Type":"ContainerStarted","Data":"ed2d295eabf42502a18a64ac219823cb555bcddcd2914b249e4979c958176128"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.729137 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" event={"ID":"016925fa-5297-45db-8aa9-3f0310eb573f","Type":"ContainerStarted","Data":"7f218014dde0e84b012aa3e5aff76e4cfd6894a92356fc366c7f652b41de1507"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.731023 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" event={"ID":"4cc1838c-dcb5-4c77-9d34-91507d631e3d","Type":"ContainerStarted","Data":"c30ac18fcad533da2a817eca8e7b4a32347fa74ee4ce04f24c21cedb3d7db362"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.738967 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" event={"ID":"99864770-491a-4f8e-8f3f-688436dc18ba","Type":"ContainerStarted","Data":"ad336cc25355f3ea0064d15e18147c2426792ee40db00efeddf5279a9d44dfac"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.741494 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" event={"ID":"b448908d-5bb0-437e-a06c-d608dd395160","Type":"ContainerStarted","Data":"dee25d683636328fabc909214943c14499e8d3b219057aa9601a1bbae50c9260"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.743353 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.744542 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.244473743 +0000 UTC m=+222.495800897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.753763 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vff44" podStartSLOduration=150.75374131 podStartE2EDuration="2m30.75374131s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:19.71589426 +0000 UTC m=+221.967221394" watchObservedRunningTime="2026-02-23 06:48:19.75374131 +0000 UTC m=+222.005068444" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.756037 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" event={"ID":"cca77003-2334-4205-97ae-87f0ae6d34cc","Type":"ContainerStarted","Data":"5be37e006f092620ecc7b0dc1f9dc35a41d64707337a33b11248ecfa74b5f535"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.764220 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" event={"ID":"ac96f0e5-3605-48c8-80b4-ef4e02123af8","Type":"ContainerStarted","Data":"a82c9bd072cdbde59bcdf124a13a170d9ccefd8370ea95c2dd365b96962f33ce"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.767926 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fdfm4"] Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.770815 5047 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" event={"ID":"d1fdf697-2344-4656-8ae3-8f516f5dd1ca","Type":"ContainerStarted","Data":"c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603"} Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.772772 5047 patch_prober.go:28] interesting pod/console-operator-58897d9998-dhcsk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.772849 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dhcsk" podUID="ac0d3a9f-d965-453b-8dcc-5cb638715ffa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.846149 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.848369 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.348346944 +0000 UTC m=+222.599674078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: W0223 06:48:19.948140 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod601765d3_2ae0_4cd2_a1fb_2c54de37487b.slice/crio-19f0942471b50cd0bd381679d57b51dda2bf345b557744853c5a24aedfe9cbdf WatchSource:0}: Error finding container 19f0942471b50cd0bd381679d57b51dda2bf345b557744853c5a24aedfe9cbdf: Status 404 returned error can't find the container with id 19f0942471b50cd0bd381679d57b51dda2bf345b557744853c5a24aedfe9cbdf Feb 23 06:48:19 crc kubenswrapper[5047]: I0223 06:48:19.948776 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:19 crc kubenswrapper[5047]: E0223 06:48:19.949372 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.449341265 +0000 UTC m=+222.700668399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:19 crc kubenswrapper[5047]: W0223 06:48:19.950695 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d363fb9_3892_412c_a2bd_ba00a377c2eb.slice/crio-3f3d58cd951c975286739c049390fbd69e1a6eb23825e4565b722f55cb6e4802 WatchSource:0}: Error finding container 3f3d58cd951c975286739c049390fbd69e1a6eb23825e4565b722f55cb6e4802: Status 404 returned error can't find the container with id 3f3d58cd951c975286739c049390fbd69e1a6eb23825e4565b722f55cb6e4802 Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.052601 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.053781 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.553764521 +0000 UTC m=+222.805091655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.149985 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6"] Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.159538 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.160056 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.660034298 +0000 UTC m=+222.911361432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.240836 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg"] Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.244863 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976"] Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.261375 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.261851 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.761833101 +0000 UTC m=+223.013160235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.263158 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4"] Feb 23 06:48:20 crc kubenswrapper[5047]: W0223 06:48:20.281054 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefef4bf3_fb34_4cca_b99e_a00249f38e11.slice/crio-2f7640893d282880970aec6d72854cddcbffd091dc4873983ae0b92574f232c0 WatchSource:0}: Error finding container 2f7640893d282880970aec6d72854cddcbffd091dc4873983ae0b92574f232c0: Status 404 returned error can't find the container with id 2f7640893d282880970aec6d72854cddcbffd091dc4873983ae0b92574f232c0 Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.336472 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p7tmt"] Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.362271 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.363633 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:20.863605135 +0000 UTC m=+223.114932269 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.370111 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f8298"] Feb 23 06:48:20 crc kubenswrapper[5047]: W0223 06:48:20.390796 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd845d9f8_e91e_46b9_b484_dbfeb6006c7f.slice/crio-c7706a9f207e8d0896d04400047ea942eabb8238184365de6098db73b8ee6d02 WatchSource:0}: Error finding container c7706a9f207e8d0896d04400047ea942eabb8238184365de6098db73b8ee6d02: Status 404 returned error can't find the container with id c7706a9f207e8d0896d04400047ea942eabb8238184365de6098db73b8ee6d02 Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.412210 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.416363 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.416406 5047 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.420039 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cm66d"] Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.458978 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6vkr2" podStartSLOduration=151.458946089 podStartE2EDuration="2m31.458946089s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.437079842 +0000 UTC m=+222.688406996" watchObservedRunningTime="2026-02-23 06:48:20.458946089 +0000 UTC m=+222.710273223" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.459454 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m247m"] Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.469642 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.470032 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 06:48:20.970018996 +0000 UTC m=+223.221346130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.514503 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qxj5w" podStartSLOduration=151.514482119 podStartE2EDuration="2m31.514482119s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.514371146 +0000 UTC m=+222.765698280" watchObservedRunningTime="2026-02-23 06:48:20.514482119 +0000 UTC m=+222.765809253" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.525427 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rcb6l" podStartSLOduration=151.525403111 podStartE2EDuration="2m31.525403111s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.476891056 +0000 UTC m=+222.728218190" watchObservedRunningTime="2026-02-23 06:48:20.525403111 +0000 UTC m=+222.776730245" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.570445 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.570872 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.070837152 +0000 UTC m=+223.322164286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.571266 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.573325 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.072855588 +0000 UTC m=+223.324182722 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.574330 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-v5zcl" podStartSLOduration=151.574312359 podStartE2EDuration="2m31.574312359s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.563603151 +0000 UTC m=+222.814930305" watchObservedRunningTime="2026-02-23 06:48:20.574312359 +0000 UTC m=+222.825639493" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.600727 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qgn89" podStartSLOduration=151.60070553 podStartE2EDuration="2m31.60070553s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.598994293 +0000 UTC m=+222.850321417" watchObservedRunningTime="2026-02-23 06:48:20.60070553 +0000 UTC m=+222.852032664" Feb 23 06:48:20 crc kubenswrapper[5047]: W0223 06:48:20.640679 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71636caa_95b7_4d64_a6f1_1cdf7dc03c07.slice/crio-ca7f0abf447772eb70bf800d6251530d02c99089b25db49b82b6f5926b765118 WatchSource:0}: Error finding container 
ca7f0abf447772eb70bf800d6251530d02c99089b25db49b82b6f5926b765118: Status 404 returned error can't find the container with id ca7f0abf447772eb70bf800d6251530d02c99089b25db49b82b6f5926b765118 Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.649232 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7tlp5" podStartSLOduration=151.649188836 podStartE2EDuration="2m31.649188836s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.637292805 +0000 UTC m=+222.888619939" watchObservedRunningTime="2026-02-23 06:48:20.649188836 +0000 UTC m=+222.900515970" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.674313 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.675054 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.175018302 +0000 UTC m=+223.426345436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.684666 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.685468 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.185448881 +0000 UTC m=+223.436776005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.704569 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"] Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.706234 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" podStartSLOduration=151.706192266 podStartE2EDuration="2m31.706192266s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.67856915 +0000 UTC m=+222.929896284" watchObservedRunningTime="2026-02-23 06:48:20.706192266 +0000 UTC m=+222.957519400" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.729736 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-g5zp5" podStartSLOduration=151.729700448 podStartE2EDuration="2m31.729700448s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.716114832 +0000 UTC m=+222.967441966" watchObservedRunningTime="2026-02-23 06:48:20.729700448 +0000 UTC m=+222.981027582" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.787502 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.787819 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.287769269 +0000 UTC m=+223.539096403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.787985 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.788460 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.288450157 +0000 UTC m=+223.539777601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.832240 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m247m" event={"ID":"b0031f21-2311-48e2-acb4-fc64475e66a2","Type":"ContainerStarted","Data":"fbad9ff2e53851836584f2f289d865d1f4b7821579571f6f955d46950a86b1aa"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.840500 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" event={"ID":"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73","Type":"ContainerStarted","Data":"6552078e15f3c6d7fb78fae85c10102c326e17767b16f4adb517c5bbf59442fa"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.870051 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" event={"ID":"4cc1838c-dcb5-4c77-9d34-91507d631e3d","Type":"ContainerStarted","Data":"7557f76dc4f201f8ab5ffa33ae734ed51d3f7288b7930e6a8607f7097e8e5820"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.885442 5047 generic.go:334] "Generic (PLEG): container finished" podID="aa630c01-a2ea-4988-b0db-dc5ae558c1e1" containerID="628072d3f7e9eb1cb0bc0cc950a0e645e2dbcca317c808583a0ef6b1e4e46ffb" exitCode=0 Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.885546 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" 
event={"ID":"aa630c01-a2ea-4988-b0db-dc5ae558c1e1","Type":"ContainerDied","Data":"628072d3f7e9eb1cb0bc0cc950a0e645e2dbcca317c808583a0ef6b1e4e46ffb"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.888793 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.889316 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.389291464 +0000 UTC m=+223.640618598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.897165 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-952vj" podStartSLOduration=151.897126562 podStartE2EDuration="2m31.897126562s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:20.894209751 +0000 UTC m=+223.145536905" watchObservedRunningTime="2026-02-23 
06:48:20.897126562 +0000 UTC m=+223.148453696" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.940898 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" event={"ID":"601765d3-2ae0-4cd2-a1fb-2c54de37487b","Type":"ContainerStarted","Data":"19f0942471b50cd0bd381679d57b51dda2bf345b557744853c5a24aedfe9cbdf"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.944669 5047 generic.go:334] "Generic (PLEG): container finished" podID="ac96f0e5-3605-48c8-80b4-ef4e02123af8" containerID="93c39a132fae236cee2b5494dc8b890caafa949fedc9c1cc80255160c290b613" exitCode=0 Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.946689 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" event={"ID":"ac96f0e5-3605-48c8-80b4-ef4e02123af8","Type":"ContainerDied","Data":"93c39a132fae236cee2b5494dc8b890caafa949fedc9c1cc80255160c290b613"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.949338 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" event={"ID":"71636caa-95b7-4d64-a6f1-1cdf7dc03c07","Type":"ContainerStarted","Data":"ca7f0abf447772eb70bf800d6251530d02c99089b25db49b82b6f5926b765118"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.950266 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql" event={"ID":"ef86187f-af31-4959-8d1c-fb954bed58d0","Type":"ContainerStarted","Data":"7dbac5cfddfd2cabe18a2b0d5958e15e11fc8e2afaa040fe5e859e7c00881bbb"} Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.959828 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" event={"ID":"cb3f2abe-d31e-4ece-aae1-abe99f0b1425","Type":"ContainerStarted","Data":"59ea4b4e3f62e5e211baf23ca57267f59a755f360085127dfad840cc399a0fcb"} Feb 23 06:48:20 
crc kubenswrapper[5047]: I0223 06:48:20.961451 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.971194 5047 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nvbsc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" start-of-body= Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.971270 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" podUID="cb3f2abe-d31e-4ece-aae1-abe99f0b1425" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.990345 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:20 crc kubenswrapper[5047]: I0223 06:48:20.993644 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" event={"ID":"4a1e6f5f-7b32-4f6f-bc21-bc520e469640","Type":"ContainerStarted","Data":"cb3d7c9327b7503b775f9dd35a125c93f32a949a24308589a0fe08610eff7625"} Feb 23 06:48:20 crc kubenswrapper[5047]: E0223 06:48:20.994888 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 06:48:21.494873013 +0000 UTC m=+223.746200147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.025506 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p7tmt" event={"ID":"c41c983d-45b3-45e8-9562-70d269d414ff","Type":"ContainerStarted","Data":"e1538c864e398a443e8e4bfce5770fbb9c4b35cac75fbed227f50fa55e809ada"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.033247 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" podStartSLOduration=152.033194186 podStartE2EDuration="2m32.033194186s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.030844841 +0000 UTC m=+223.282171975" watchObservedRunningTime="2026-02-23 06:48:21.033194186 +0000 UTC m=+223.284521320" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.044441 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" event={"ID":"3fee8332-3258-4cf5-a2c6-f1d0c1c684f0","Type":"ContainerStarted","Data":"ccd4e3ad6fefdadfe8b39e449be5e63b87c8d45cf2eea9523e92a2eeb3323b4c"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.046627 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" event={"ID":"d790e45e-5411-4acb-8875-23c6395250c8","Type":"ContainerStarted","Data":"1f3b3a3ca949a367520dfa8f57dfcc3c8331e41532214bb063437113532e917c"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.050814 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" event={"ID":"1d363fb9-3892-412c-a2bd-ba00a377c2eb","Type":"ContainerStarted","Data":"3f3d58cd951c975286739c049390fbd69e1a6eb23825e4565b722f55cb6e4802"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.057736 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f8298" event={"ID":"0184e0f6-46c5-4a13-967f-ea628e1478b6","Type":"ContainerStarted","Data":"67bf3cc9844be2c951fe95b9393cec7e1f0477d7f511c44d81b165ee4e00f6f7"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.064268 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" event={"ID":"cca77003-2334-4205-97ae-87f0ae6d34cc","Type":"ContainerStarted","Data":"9e7232021b42292e8691e9a3e9e6a26170b0165907530d901e2763a50c102ad2"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.065240 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-27bdk" podStartSLOduration=151.065212404 podStartE2EDuration="2m31.065212404s" podCreationTimestamp="2026-02-23 06:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.050532927 +0000 UTC m=+223.301860061" watchObservedRunningTime="2026-02-23 06:48:21.065212404 +0000 UTC m=+223.316539538" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.074310 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" event={"ID":"1379a9ff-16c4-4717-b9d8-59ea36808a48","Type":"ContainerStarted","Data":"9cf1540e1dd0a188f8aba86ad16a97aff70e1badcdac4f1f6903ad98f241fc76"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.079187 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.080044 5047 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fsgwn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.080092 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" podUID="1379a9ff-16c4-4717-b9d8-59ea36808a48" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.080529 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" podStartSLOduration=152.080498538 podStartE2EDuration="2m32.080498538s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.080263422 +0000 UTC m=+223.331590556" watchObservedRunningTime="2026-02-23 06:48:21.080498538 +0000 UTC m=+223.331825672" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.086793 5047 csr.go:261] certificate signing request csr-7wb5v is approved, waiting to be issued Feb 23 
06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.092222 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.093259 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.59321769 +0000 UTC m=+223.844544924 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.097050 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" event={"ID":"016925fa-5297-45db-8aa9-3f0310eb573f","Type":"ContainerStarted","Data":"eccaf45facc8b7ee01225f8fec7e7c1e29feea09af91500756d175748dc08a73"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.100680 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" event={"ID":"d845d9f8-e91e-46b9-b484-dbfeb6006c7f","Type":"ContainerStarted","Data":"c7706a9f207e8d0896d04400047ea942eabb8238184365de6098db73b8ee6d02"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.132207 5047 csr.go:257] 
certificate signing request csr-7wb5v is issued Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.150481 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rhztb" podStartSLOduration=151.150458928 podStartE2EDuration="2m31.150458928s" podCreationTimestamp="2026-02-23 06:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.142245711 +0000 UTC m=+223.393572845" watchObservedRunningTime="2026-02-23 06:48:21.150458928 +0000 UTC m=+223.401786062" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.173226 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cfg96" podStartSLOduration=152.173204819 podStartE2EDuration="2m32.173204819s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.171839761 +0000 UTC m=+223.423166895" watchObservedRunningTime="2026-02-23 06:48:21.173204819 +0000 UTC m=+223.424531953" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.209651 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" event={"ID":"37497943-34b6-4c1b-988e-c7b07e9bf608","Type":"ContainerStarted","Data":"24d8c0e1cc32abce5f9795edf31df3e791204fd3d5336eea01b50458b7f328a8"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.210849 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" podStartSLOduration=152.210827062 podStartE2EDuration="2m32.210827062s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-23 06:48:21.210110893 +0000 UTC m=+223.461438037" watchObservedRunningTime="2026-02-23 06:48:21.210827062 +0000 UTC m=+223.462154196" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.210968 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.211587 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.711558522 +0000 UTC m=+223.962885696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.251006 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" event={"ID":"c20af121-a913-4178-aa60-35ca34fe91b2","Type":"ContainerStarted","Data":"a6baec86f181ad198893d1f0533db7adbd04a719786eafe9d394a9b687da1020"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.275098 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w6ztx" 
podStartSLOduration=152.275066844 podStartE2EDuration="2m32.275066844s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.274489918 +0000 UTC m=+223.525817052" watchObservedRunningTime="2026-02-23 06:48:21.275066844 +0000 UTC m=+223.526393978" Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.285209 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" event={"ID":"99864770-491a-4f8e-8f3f-688436dc18ba","Type":"ContainerStarted","Data":"d1c01c6900eebd39e0f678ea52092c3267c7b1f0da46f664310e3d622d6e50b1"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.292004 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" event={"ID":"1301a9f3-55a0-4a5d-b851-e764b459297e","Type":"ContainerStarted","Data":"4354b93af65137bae0d913b419ada4c1912df5d436474753c03564338e0eabb3"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.292058 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" event={"ID":"1301a9f3-55a0-4a5d-b851-e764b459297e","Type":"ContainerStarted","Data":"008813ec3485dbced82c9c048f2059aba420db46a037f3046b0b58fb75836073"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.294477 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" event={"ID":"02886198-ff3b-442c-8508-c29fa5dbd216","Type":"ContainerStarted","Data":"90ee98cbed0a50b78309478ed4abcb8da6bd385de060c4215c21e4b8e1b7c7bc"} Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.297348 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" 
event={"ID":"af373c51-80d6-4656-9b92-800ebb6244b5","Type":"ContainerStarted","Data":"535cba2e1d13a46199a60b886b5e1c4630444c2bd65ec8e3caa9727089a1e2b1"}
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.301934 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-22tgp" podStartSLOduration=152.301899849 podStartE2EDuration="2m32.301899849s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.300348145 +0000 UTC m=+223.551675289" watchObservedRunningTime="2026-02-23 06:48:21.301899849 +0000 UTC m=+223.553226983"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.312464 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.313731 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.813709486 +0000 UTC m=+224.065036620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.320202 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tfqhr" event={"ID":"ff54766c-b299-4eab-8a68-8d85552d861f","Type":"ContainerStarted","Data":"2c3a89346a5c550d89ea743d38f1a052dc81301eb545c087b6646f547cd4bb20"}
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.320271 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tfqhr" event={"ID":"ff54766c-b299-4eab-8a68-8d85552d861f","Type":"ContainerStarted","Data":"b073234f56ade784483e3472a1ab22e3931d3c2e6a8b636711cdb12d4e3f8bbf"}
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.341296 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" event={"ID":"65bc34e5-539d-4d00-8562-f8b02a455c4d","Type":"ContainerStarted","Data":"cb01664b3a1650ad9727f7d653f814cf192776d1ed077b01c92c4c6874907310"}
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.344828 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" event={"ID":"efef4bf3-fb34-4cca-b99e-a00249f38e11","Type":"ContainerStarted","Data":"2f7640893d282880970aec6d72854cddcbffd091dc4873983ae0b92574f232c0"}
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.345898 5047 patch_prober.go:28] interesting pod/downloads-7954f5f757-75wvb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.345963 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-75wvb" podUID="b4a2bba0-ebb9-4923-a829-69112ef9c89c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.349200 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.354252 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tfqhr" podStartSLOduration=6.354219909 podStartE2EDuration="6.354219909s" podCreationTimestamp="2026-02-23 06:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.351101773 +0000 UTC m=+223.602428917" watchObservedRunningTime="2026-02-23 06:48:21.354219909 +0000 UTC m=+223.605547043"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.382858 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-dhcsk"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.409436 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25gfm" podStartSLOduration=152.409384079 podStartE2EDuration="2m32.409384079s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:21.405886893 +0000 UTC m=+223.657214017" watchObservedRunningTime="2026-02-23 06:48:21.409384079 +0000 UTC m=+223.660711203"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.431703 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.432040 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.432327 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.432436 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.432558 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.445815 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.447909 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:21.947886098 +0000 UTC m=+224.199213232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.451725 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:21 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:21 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:21 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.456828 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.466475 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.469111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.469727 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.538494 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.539127 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.039099377 +0000 UTC m=+224.290426511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.640809 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.641745 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.141727954 +0000 UTC m=+224.393055088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.670510 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.681490 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.702470 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.744648 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.745330 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.245283716 +0000 UTC m=+224.496610850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.849157 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.849616 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.349600959 +0000 UTC m=+224.600928093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:21 crc kubenswrapper[5047]: I0223 06:48:21.951699 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:21 crc kubenswrapper[5047]: E0223 06:48:21.952767 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.452724289 +0000 UTC m=+224.704051423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.053966 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.057481 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.557448083 +0000 UTC m=+224.808775217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.068412 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.133558 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-23 06:43:21 +0000 UTC, rotation deadline is 2027-01-17 01:32:25.949321237 +0000 UTC
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.133949 5047 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7866h44m3.815375251s for next certificate rotation
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.156144 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.156497 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.656479231 +0000 UTC m=+224.907806365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.257491 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.258028 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.758013396 +0000 UTC m=+225.009340530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: W0223 06:48:22.270761 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-dc52d3a28d2026d34f3483990129dde8b299efa89eb42203a3bc3ae184d49375 WatchSource:0}: Error finding container dc52d3a28d2026d34f3483990129dde8b299efa89eb42203a3bc3ae184d49375: Status 404 returned error can't find the container with id dc52d3a28d2026d34f3483990129dde8b299efa89eb42203a3bc3ae184d49375
Feb 23 06:48:22 crc kubenswrapper[5047]: W0223 06:48:22.327622 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-398c1c0599a0478f0eda691d65d40090336ebd149114a372ed52367dd570ec03 WatchSource:0}: Error finding container 398c1c0599a0478f0eda691d65d40090336ebd149114a372ed52367dd570ec03: Status 404 returned error can't find the container with id 398c1c0599a0478f0eda691d65d40090336ebd149114a372ed52367dd570ec03
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.361339 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.362364 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.862316589 +0000 UTC m=+225.113643723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.410453 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" event={"ID":"02886198-ff3b-442c-8508-c29fa5dbd216","Type":"ContainerStarted","Data":"d5f86886ca6f91c11644fa80fec94e577ca067f40bce3a8bc1ee2853d25e1124"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.422490 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:22 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:22 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:22 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.422575 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.441775 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bwkb2" podStartSLOduration=153.441752403 podStartE2EDuration="2m33.441752403s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.440626461 +0000 UTC m=+224.691953595" watchObservedRunningTime="2026-02-23 06:48:22.441752403 +0000 UTC m=+224.693079537"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.442314 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" event={"ID":"d845d9f8-e91e-46b9-b484-dbfeb6006c7f","Type":"ContainerStarted","Data":"8f6e37f1f6cd5a01340c7a7c2791d58e7a3095aab22cfed70b914d2f52d6dc8d"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.463080 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" event={"ID":"efef4bf3-fb34-4cca-b99e-a00249f38e11","Type":"ContainerStarted","Data":"50dfba0f388038c35b86bce47851d5f07e4e88c4e16fe994707072a752da7cab"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.463126 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" event={"ID":"efef4bf3-fb34-4cca-b99e-a00249f38e11","Type":"ContainerStarted","Data":"040d63994c1da6daf6e8944545f8d43d995197a334fef87b1614239a6612684a"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.463510 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.465050 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:22.965036388 +0000 UTC m=+225.216363522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: W0223 06:48:22.465213 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-3fc6f771f8a9b631d5f97b794d4d8c88e6392aa44ecbdcd1f8bde499a3b16ce8 WatchSource:0}: Error finding container 3fc6f771f8a9b631d5f97b794d4d8c88e6392aa44ecbdcd1f8bde499a3b16ce8: Status 404 returned error can't find the container with id 3fc6f771f8a9b631d5f97b794d4d8c88e6392aa44ecbdcd1f8bde499a3b16ce8
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.469360 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" event={"ID":"d4967a59-3c6c-4ec3-9c70-4378ec3702c6","Type":"ContainerStarted","Data":"8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.469400 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" event={"ID":"d4967a59-3c6c-4ec3-9c70-4378ec3702c6","Type":"ContainerStarted","Data":"ff56b88f7a69f2783438d6fd485a78e536e316e4b509e276cccb353fe592bc10"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.470374 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.473019 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" podStartSLOduration=153.473009539 podStartE2EDuration="2m33.473009539s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.471816976 +0000 UTC m=+224.723144110" watchObservedRunningTime="2026-02-23 06:48:22.473009539 +0000 UTC m=+224.724336673"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.478958 5047 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ct67m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.479014 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" podUID="d4967a59-3c6c-4ec3-9c70-4378ec3702c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.500537 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-t45d6" podStartSLOduration=153.500512832 podStartE2EDuration="2m33.500512832s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.49683229 +0000 UTC m=+224.748159424" watchObservedRunningTime="2026-02-23 06:48:22.500512832 +0000 UTC m=+224.751839966"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.510058 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mdvr" event={"ID":"d790e45e-5411-4acb-8875-23c6395250c8","Type":"ContainerStarted","Data":"2cb7715ae462242e3d58945b55e7d7ff0487462d6c8b251b646d61dba99123ef"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.525383 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" event={"ID":"601765d3-2ae0-4cd2-a1fb-2c54de37487b","Type":"ContainerStarted","Data":"4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.526215 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.547729 5047 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fdfm4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.547836 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.547857 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" event={"ID":"71636caa-95b7-4d64-a6f1-1cdf7dc03c07","Type":"ContainerStarted","Data":"f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.548717 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.549259 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" podStartSLOduration=152.549244034 podStartE2EDuration="2m32.549244034s" podCreationTimestamp="2026-02-23 06:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.541940801 +0000 UTC m=+224.793267935" watchObservedRunningTime="2026-02-23 06:48:22.549244034 +0000 UTC m=+224.800571168"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.561707 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dc52d3a28d2026d34f3483990129dde8b299efa89eb42203a3bc3ae184d49375"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.565119 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.565490 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.065455464 +0000 UTC m=+225.316782598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.565807 5047 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cm66d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.565885 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.568178 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.570267 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.070251136 +0000 UTC m=+225.321578270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.595356 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p7tmt" event={"ID":"c41c983d-45b3-45e8-9562-70d269d414ff","Type":"ContainerStarted","Data":"e06a4acaae800e78c8f7701acbb313e12a57c59f769ab01e7606938e10d4d78f"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.596059 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p7tmt" event={"ID":"c41c983d-45b3-45e8-9562-70d269d414ff","Type":"ContainerStarted","Data":"5a520ceb130d0c38ef39b633637a3446b8f724e621441c40435dafd4a8fc556e"}
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.596791 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-p7tmt"
Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.611192 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql"
event={"ID":"ef86187f-af31-4959-8d1c-fb954bed58d0","Type":"ContainerStarted","Data":"3a6fd910c7fead7ab18ae8491864981cd1184df78d0b9bbfd070dc468746fe4d"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.611251 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql" event={"ID":"ef86187f-af31-4959-8d1c-fb954bed58d0","Type":"ContainerStarted","Data":"3b25f79f273a9155cfdf5111993299fb27513bbaf1a3e70b6ee76e87f6a438b3"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.618125 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" podStartSLOduration=153.618097514 podStartE2EDuration="2m33.618097514s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.566507662 +0000 UTC m=+224.817834796" watchObservedRunningTime="2026-02-23 06:48:22.618097514 +0000 UTC m=+224.869424638" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.652651 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" event={"ID":"37497943-34b6-4c1b-988e-c7b07e9bf608","Type":"ContainerStarted","Data":"b21d88db381c3d5a763e92d318ec1605cc30b64cbaced39a0a0343b7dd271f1f"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.653626 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.655402 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" podStartSLOduration=153.655390018 podStartE2EDuration="2m33.655390018s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.619609015 +0000 UTC m=+224.870936139" watchObservedRunningTime="2026-02-23 06:48:22.655390018 +0000 UTC m=+224.906717152" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.655817 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p7tmt" podStartSLOduration=7.655810739 podStartE2EDuration="7.655810739s" podCreationTimestamp="2026-02-23 06:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.653980318 +0000 UTC m=+224.905307452" watchObservedRunningTime="2026-02-23 06:48:22.655810739 +0000 UTC m=+224.907137873" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.667107 5047 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-z7flg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.667171 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" podUID="37497943-34b6-4c1b-988e-c7b07e9bf608" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.670191 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 
06:48:22.672288 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.172254245 +0000 UTC m=+225.423581439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.691503 5047 generic.go:334] "Generic (PLEG): container finished" podID="af373c51-80d6-4656-9b92-800ebb6244b5" containerID="e8623770de18657d9392be984b1591c9021a24c845a4ea35697bb3d3b99393c5" exitCode=0 Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.691651 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" event={"ID":"af373c51-80d6-4656-9b92-800ebb6244b5","Type":"ContainerStarted","Data":"9963bc8d83200e10d02e080ede6184862b089d9e6b189b7b60ea0505fbe20d7e"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.691702 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" event={"ID":"af373c51-80d6-4656-9b92-800ebb6244b5","Type":"ContainerDied","Data":"e8623770de18657d9392be984b1591c9021a24c845a4ea35697bb3d3b99393c5"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.691998 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hh4ql" podStartSLOduration=153.691973972 podStartE2EDuration="2m33.691973972s" 
podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.688596569 +0000 UTC m=+224.939923703" watchObservedRunningTime="2026-02-23 06:48:22.691973972 +0000 UTC m=+224.943301106" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.706984 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"398c1c0599a0478f0eda691d65d40090336ebd149114a372ed52367dd570ec03"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.745947 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" event={"ID":"aa630c01-a2ea-4988-b0db-dc5ae558c1e1","Type":"ContainerStarted","Data":"453515346d0cd00fe354d7946bd02507952a31acb7b473e2d2893c0040eaa0ca"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.746508 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.765019 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg" podStartSLOduration=153.764978787 podStartE2EDuration="2m33.764978787s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.717556341 +0000 UTC m=+224.968883475" watchObservedRunningTime="2026-02-23 06:48:22.764978787 +0000 UTC m=+225.016305921" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.766488 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" podStartSLOduration=153.766482518 podStartE2EDuration="2m33.766482518s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.762101847 +0000 UTC m=+225.013428991" watchObservedRunningTime="2026-02-23 06:48:22.766482518 +0000 UTC m=+225.017809652" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.768447 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" event={"ID":"9e98b1fd-d7b2-4af4-92c3-f6b4df04df73","Type":"ContainerStarted","Data":"57fd3e53d906eb3a1f57d6fa797db6800d04b5708556cf7a814cca5ab41febd0"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.774624 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.776328 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.276312641 +0000 UTC m=+225.527639775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.800098 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f8298" event={"ID":"0184e0f6-46c5-4a13-967f-ea628e1478b6","Type":"ContainerStarted","Data":"09f34198eecbd7287da5c470bc804b9c8d8c198048697ec9aebc2bbb8334741e"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.801953 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" podStartSLOduration=153.801941672 podStartE2EDuration="2m33.801941672s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.801832269 +0000 UTC m=+225.053159403" watchObservedRunningTime="2026-02-23 06:48:22.801941672 +0000 UTC m=+225.053268806" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.823566 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" event={"ID":"016925fa-5297-45db-8aa9-3f0310eb573f","Type":"ContainerStarted","Data":"efc6187b9932c61e681fce0d4920fe3e189044804395771ff2d9eb0bb3709bdf"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.839316 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" 
event={"ID":"1301a9f3-55a0-4a5d-b851-e764b459297e","Type":"ContainerStarted","Data":"0dd51d20f4a51687385cd83d3c72ae68b504b03d78adb08bb76ca62b61aa0975"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.839865 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.840114 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7x976" podStartSLOduration=153.840096031 podStartE2EDuration="2m33.840096031s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.839118633 +0000 UTC m=+225.090445767" watchObservedRunningTime="2026-02-23 06:48:22.840096031 +0000 UTC m=+225.091423165" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.853213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" event={"ID":"1d363fb9-3892-412c-a2bd-ba00a377c2eb","Type":"ContainerStarted","Data":"919b531f2f7fa93648dbe3754ee6d17d1f81925aacdd5c93d7287ec8523a0901"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.853265 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" event={"ID":"1d363fb9-3892-412c-a2bd-ba00a377c2eb","Type":"ContainerStarted","Data":"fdfffdacb20f02a600a92ecedab8c47453a59d7b53b99f33593e0da1a010afa8"} Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.862369 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" event={"ID":"ac96f0e5-3605-48c8-80b4-ef4e02123af8","Type":"ContainerStarted","Data":"6c22d03071c7620614dfedc4b3f36e7a9341d62be752107c073dcc7b5e5c7ee8"} Feb 23 06:48:22 crc 
kubenswrapper[5047]: I0223 06:48:22.870363 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-f8298" podStartSLOduration=7.870342019 podStartE2EDuration="7.870342019s" podCreationTimestamp="2026-02-23 06:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.86857655 +0000 UTC m=+225.119903684" watchObservedRunningTime="2026-02-23 06:48:22.870342019 +0000 UTC m=+225.121669143" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.876091 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.877340 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.377294962 +0000 UTC m=+225.628622286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.881766 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsgwn" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.938044 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.938503 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.947485 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" podStartSLOduration=153.947462988 podStartE2EDuration="2m33.947462988s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.909833775 +0000 UTC m=+225.161160929" watchObservedRunningTime="2026-02-23 06:48:22.947462988 +0000 UTC m=+225.198790122" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.953150 5047 patch_prober.go:28] interesting pod/apiserver-76f77b778f-9l6zf container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 23 
06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.953232 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" podUID="ac96f0e5-3605-48c8-80b4-ef4e02123af8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.983853 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:22 crc kubenswrapper[5047]: E0223 06:48:22.986824 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.486803439 +0000 UTC m=+225.738130573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:22 crc kubenswrapper[5047]: I0223 06:48:22.997253 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4b54r" podStartSLOduration=153.997230698 podStartE2EDuration="2m33.997230698s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.949995088 +0000 UTC m=+225.201322222" watchObservedRunningTime="2026-02-23 06:48:22.997230698 +0000 UTC m=+225.248557822" Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.042279 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" podStartSLOduration=154.042258317 podStartE2EDuration="2m34.042258317s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:22.997122475 +0000 UTC m=+225.248449619" watchObservedRunningTime="2026-02-23 06:48:23.042258317 +0000 UTC m=+225.293585451" Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.064720 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-b9dcg" podStartSLOduration=154.064692349 podStartE2EDuration="2m34.064692349s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:23.04381646 +0000 UTC m=+225.295143594" watchObservedRunningTime="2026-02-23 06:48:23.064692349 +0000 UTC m=+225.316019483" Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.085518 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.085955 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.585902857 +0000 UTC m=+225.837232601 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.187151 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.187557 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.687542027 +0000 UTC m=+225.938869161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.288887 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.289163 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.789125424 +0000 UTC m=+226.040452558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.289734 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.290088 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.79008037 +0000 UTC m=+226.041407504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.391594 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.391735 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.89171079 +0000 UTC m=+226.143037924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.391892 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.392202 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.892191503 +0000 UTC m=+226.143518627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.422046 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:23 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:23 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:23 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.422124 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.493254 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.493525 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.993457741 +0000 UTC m=+226.244784885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.493613 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.494020 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:23.994004666 +0000 UTC m=+226.245331800 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.595096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.595767 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.095710677 +0000 UTC m=+226.347037811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.595902 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.596704 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.096677634 +0000 UTC m=+226.348004768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.617549 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.617662 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.620136 5047 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-m5fb6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.40:8443/livez\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body=
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.620198 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6" podUID="af373c51-80d6-4656-9b92-800ebb6244b5" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.40:8443/livez\": dial tcp 10.217.0.40:8443: connect: connection refused"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.697993 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.698255 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.19819947 +0000 UTC m=+226.449526604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.698493 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.698937 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.198896479 +0000 UTC m=+226.450223613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.730645 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"]
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.731139 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cm66d"]
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.800362 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.800554 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.300511978 +0000 UTC m=+226.551839112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.800651 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.801016 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.300997671 +0000 UTC m=+226.552324805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.864634 5047 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nvbsc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.865026 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc" podUID="cb3f2abe-d31e-4ece-aae1-abe99f0b1425" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.873975 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" event={"ID":"ac96f0e5-3605-48c8-80b4-ef4e02123af8","Type":"ContainerStarted","Data":"cfb0b7fe2dfbdb52f344c0011e70ec58231de9745a0cc02443afb028983025fd"}
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.875849 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m247m" event={"ID":"b0031f21-2311-48e2-acb4-fc64475e66a2","Type":"ContainerStarted","Data":"9be4f82572c68e28c3210f24a0664cc36d64cb00744cd5ec04d55b4ef000409f"}
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.877262 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5e4219e639805817e2c0fecaf01ebb84b3a33c9728f39e0c536e87222b6b1e51"}
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.879018 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"120366ced36c7b31659339cf06152804b7cfe48d67212de8c3366ebdef79a41f"}
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.879043 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3fc6f771f8a9b631d5f97b794d4d8c88e6392aa44ecbdcd1f8bde499a3b16ce8"}
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.879363 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.881195 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2ed2ba42e48ed39211b776f11701db4e6f4284d5499bfe81ff825d97f1fb50eb"}
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.884509 5047 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cm66d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.884548 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.886729 5047 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fdfm4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.886797 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.901488 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.901825 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.401784026 +0000 UTC m=+226.653111150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.902970 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:23 crc kubenswrapper[5047]: E0223 06:48:23.915227 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.415203589 +0000 UTC m=+226.666530723 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:23 crc kubenswrapper[5047]: I0223 06:48:23.920197 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z7flg"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.028763 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.029245 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.52920444 +0000 UTC m=+226.780531574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.029553 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.029952 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.52994296 +0000 UTC m=+226.781270094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.030730 5047 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-csx28 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.030738 5047 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-csx28 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.030783 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" podUID="aa630c01-a2ea-4988-b0db-dc5ae558c1e1" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.030823 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" podUID="aa630c01-a2ea-4988-b0db-dc5ae558c1e1" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.131195 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.131441 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.631403525 +0000 UTC m=+226.882730659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.233114 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.233539 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.733524507 +0000 UTC m=+226.984851641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.328379 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.334516 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.334627 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.83460566 +0000 UTC m=+227.085932804 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.335136 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.335724 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.835700521 +0000 UTC m=+227.087027645 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.421372 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:24 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:24 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:24 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.421470 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.436206 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.436536 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:24.936502297 +0000 UTC m=+227.187829431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.538417 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.538973 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.038953208 +0000 UTC m=+227.290280342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.640094 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.640215 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.140196866 +0000 UTC m=+227.391524000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.640477 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.641079 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.14104695 +0000 UTC m=+227.392374084 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.726296 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zhsns"]
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.727369 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zhsns"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.741979 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.742811 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.242788391 +0000 UTC m=+227.494115525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.745311 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zhsns"]
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.812601 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.844274 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5pts\" (UniqueName: \"kubernetes.io/projected/83f104c4-c1f9-4714-a807-2a6368b538fd-kube-api-access-q5pts\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.844331 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-catalog-content\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns"
Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.844374 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\"
(UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.844408 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-utilities\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.844776 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.344760519 +0000 UTC m=+227.596087653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.886231 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerName="controller-manager" containerID="cri-o://f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc" gracePeriod=30 Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.887753 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" podUID="d4967a59-3c6c-4ec3-9c70-4378ec3702c6" 
containerName="route-controller-manager" containerID="cri-o://8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725" gracePeriod=30 Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.888958 5047 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cm66d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.889037 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.925618 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gnn8v"] Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.927026 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.942662 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.945692 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.946048 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5pts\" (UniqueName: \"kubernetes.io/projected/83f104c4-c1f9-4714-a807-2a6368b538fd-kube-api-access-q5pts\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.946147 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-catalog-content\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.946282 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnn8v"] Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.946327 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-utilities\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " 
pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:24 crc kubenswrapper[5047]: E0223 06:48:24.947576 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.44754134 +0000 UTC m=+227.698868474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.947795 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-catalog-content\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.950998 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-utilities\") pod \"community-operators-zhsns\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:24 crc kubenswrapper[5047]: I0223 06:48:24.981016 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5pts\" (UniqueName: \"kubernetes.io/projected/83f104c4-c1f9-4714-a807-2a6368b538fd-kube-api-access-q5pts\") pod \"community-operators-zhsns\" (UID: 
\"83f104c4-c1f9-4714-a807-2a6368b538fd\") " pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.048114 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.048205 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jl2s\" (UniqueName: \"kubernetes.io/projected/46419724-8d7c-47d4-9d51-3aef5c54ab1b-kube-api-access-4jl2s\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.048254 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-catalog-content\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.048278 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-utilities\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.048653 5047 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.548635794 +0000 UTC m=+227.799962928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.130851 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2qnnx"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.131965 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.141211 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.149828 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.150115 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jl2s\" (UniqueName: \"kubernetes.io/projected/46419724-8d7c-47d4-9d51-3aef5c54ab1b-kube-api-access-4jl2s\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.150171 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-catalog-content\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.150196 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-utilities\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.150655 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-utilities\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " 
pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.150741 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.650724946 +0000 UTC m=+227.902052080 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.151140 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qnnx"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.151278 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-catalog-content\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.176614 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jl2s\" (UniqueName: \"kubernetes.io/projected/46419724-8d7c-47d4-9d51-3aef5c54ab1b-kube-api-access-4jl2s\") pod \"certified-operators-gnn8v\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.191147 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-csx28" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.248278 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.251111 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnfnt\" (UniqueName: \"kubernetes.io/projected/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-kube-api-access-mnfnt\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.251185 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-catalog-content\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.251211 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-utilities\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.251236 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:25 crc 
kubenswrapper[5047]: E0223 06:48:25.251586 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.751568793 +0000 UTC m=+228.002895937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.339150 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7sssp"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.350531 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.355764 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.356028 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnfnt\" (UniqueName: \"kubernetes.io/projected/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-kube-api-access-mnfnt\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.356098 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-catalog-content\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.356120 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-utilities\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.356527 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-utilities\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " 
pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.356626 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.856603215 +0000 UTC m=+228.107930349 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.357170 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-catalog-content\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.364546 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7sssp"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.402608 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnfnt\" (UniqueName: \"kubernetes.io/projected/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-kube-api-access-mnfnt\") pod \"community-operators-2qnnx\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.423539 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:48:25 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld Feb 23 06:48:25 crc kubenswrapper[5047]: [+]process-running ok Feb 23 06:48:25 crc kubenswrapper[5047]: healthz check failed Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.423625 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.445792 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.460769 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-utilities\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.460828 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-catalog-content\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.460906 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48zt\" (UniqueName: \"kubernetes.io/projected/b848e749-e1a5-47ab-aca5-13e799a2504e-kube-api-access-c48zt\") 
pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.460949 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.461285 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:25.961269258 +0000 UTC m=+228.212596392 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.483057 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.557100 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-694f4c68df-rbr22"] Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.557432 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerName="controller-manager" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.557449 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerName="controller-manager" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.557589 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerName="controller-manager" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.558144 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.562125 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-694f4c68df-rbr22"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.562545 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsltq\" (UniqueName: \"kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq\") pod \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.562578 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles\") pod \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.562675 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca\") pod \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.562738 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config\") pod \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.562970 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.563018 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-serving-cert\") pod \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\" (UID: \"71636caa-95b7-4d64-a6f1-1cdf7dc03c07\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.563182 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-catalog-content\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.563251 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48zt\" (UniqueName: \"kubernetes.io/projected/b848e749-e1a5-47ab-aca5-13e799a2504e-kube-api-access-c48zt\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.563283 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-utilities\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.564244 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-utilities\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " 
pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.564699 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.064677706 +0000 UTC m=+228.316004840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.566703 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "71636caa-95b7-4d64-a6f1-1cdf7dc03c07" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.567128 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca" (OuterVolumeSpecName: "client-ca") pod "71636caa-95b7-4d64-a6f1-1cdf7dc03c07" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.567669 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config" (OuterVolumeSpecName: "config") pod "71636caa-95b7-4d64-a6f1-1cdf7dc03c07" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.567822 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq" (OuterVolumeSpecName: "kube-api-access-qsltq") pod "71636caa-95b7-4d64-a6f1-1cdf7dc03c07" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07"). InnerVolumeSpecName "kube-api-access-qsltq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.568122 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-catalog-content\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.576852 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71636caa-95b7-4d64-a6f1-1cdf7dc03c07" (UID: "71636caa-95b7-4d64-a6f1-1cdf7dc03c07"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.596706 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c48zt\" (UniqueName: \"kubernetes.io/projected/b848e749-e1a5-47ab-aca5-13e799a2504e-kube-api-access-c48zt\") pod \"certified-operators-7sssp\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.647535 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.671191 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca\") pod \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.671256 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert\") pod \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.671315 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26trc\" (UniqueName: \"kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc\") pod \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.671636 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config\") pod 
\"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\" (UID: \"d4967a59-3c6c-4ec3-9c70-4378ec3702c6\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.671854 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd9hd\" (UniqueName: \"kubernetes.io/projected/81624f2b-37c1-42b3-863c-2f4a61bb68cc-kube-api-access-pd9hd\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.671881 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81624f2b-37c1-42b3-863c-2f4a61bb68cc-serving-cert\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.671898 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-client-ca\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.672976 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673004 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-config\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673044 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-proxy-ca-bundles\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673141 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsltq\" (UniqueName: \"kubernetes.io/projected/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-kube-api-access-qsltq\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673152 5047 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673161 5047 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673170 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673181 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/71636caa-95b7-4d64-a6f1-1cdf7dc03c07-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.673980 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4967a59-3c6c-4ec3-9c70-4378ec3702c6" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.680259 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.180236722 +0000 UTC m=+228.431563856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.681466 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config" (OuterVolumeSpecName: "config") pod "d4967a59-3c6c-4ec3-9c70-4378ec3702c6" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.682719 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4967a59-3c6c-4ec3-9c70-4378ec3702c6" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.682958 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.703873 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc" (OuterVolumeSpecName: "kube-api-access-26trc") pod "d4967a59-3c6c-4ec3-9c70-4378ec3702c6" (UID: "d4967a59-3c6c-4ec3-9c70-4378ec3702c6"). InnerVolumeSpecName "kube-api-access-26trc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.758872 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zhsns"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774202 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774391 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-proxy-ca-bundles\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774478 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd9hd\" (UniqueName: \"kubernetes.io/projected/81624f2b-37c1-42b3-863c-2f4a61bb68cc-kube-api-access-pd9hd\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774503 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81624f2b-37c1-42b3-863c-2f4a61bb68cc-serving-cert\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774518 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-client-ca\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774550 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-config\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774599 5047 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774611 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774621 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26trc\" (UniqueName: \"kubernetes.io/projected/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-kube-api-access-26trc\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.774633 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4967a59-3c6c-4ec3-9c70-4378ec3702c6-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.775866 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-config\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.775966 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.275944976 +0000 UTC m=+228.527272110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.776613 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-proxy-ca-bundles\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.778507 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-client-ca\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.791610 5047 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.792038 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4967a59-3c6c-4ec3-9c70-4378ec3702c6" containerName="route-controller-manager" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.792123 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4967a59-3c6c-4ec3-9c70-4378ec3702c6" containerName="route-controller-manager" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.794549 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4967a59-3c6c-4ec3-9c70-4378ec3702c6" containerName="route-controller-manager" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.808895 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd9hd\" (UniqueName: \"kubernetes.io/projected/81624f2b-37c1-42b3-863c-2f4a61bb68cc-kube-api-access-pd9hd\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.811203 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.813271 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81624f2b-37c1-42b3-863c-2f4a61bb68cc-serving-cert\") pod \"controller-manager-694f4c68df-rbr22\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.816134 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.816278 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.844597 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.851519 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnn8v"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.876106 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66cfc9b8-5b52-47f9-91be-e41f21385713-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"66cfc9b8-5b52-47f9-91be-e41f21385713\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.876150 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66cfc9b8-5b52-47f9-91be-e41f21385713-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"66cfc9b8-5b52-47f9-91be-e41f21385713\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.876217 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.876676 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.37666042 +0000 UTC m=+228.627987554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.887250 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qnnx"] Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.914568 5047 generic.go:334] "Generic (PLEG): container finished" podID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" containerID="f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc" exitCode=0 Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.914773 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" 
event={"ID":"71636caa-95b7-4d64-a6f1-1cdf7dc03c07","Type":"ContainerDied","Data":"f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc"} Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.915179 5047 scope.go:117] "RemoveContainer" containerID="f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.915279 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.916239 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.923516 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cm66d" event={"ID":"71636caa-95b7-4d64-a6f1-1cdf7dc03c07","Type":"ContainerDied","Data":"ca7f0abf447772eb70bf800d6251530d02c99089b25db49b82b6f5926b765118"} Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.923692 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhsns" event={"ID":"83f104c4-c1f9-4714-a807-2a6368b538fd","Type":"ContainerStarted","Data":"79394dc7f58674e369e970e52ac36b4d6c6a8b93a10204fcb2c952b8cc81ba35"} Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.923972 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnn8v" event={"ID":"46419724-8d7c-47d4-9d51-3aef5c54ab1b","Type":"ContainerStarted","Data":"7b61f12517e6ee30d60048f3d901538c200c6c6367f94509aa8a817732669022"} Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.926381 5047 generic.go:334] "Generic (PLEG): container finished" podID="d4967a59-3c6c-4ec3-9c70-4378ec3702c6" containerID="8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725" exitCode=0 Feb 
23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.927627 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.928006 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" event={"ID":"d4967a59-3c6c-4ec3-9c70-4378ec3702c6","Type":"ContainerDied","Data":"8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725"} Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.928158 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m" event={"ID":"d4967a59-3c6c-4ec3-9c70-4378ec3702c6","Type":"ContainerDied","Data":"ff56b88f7a69f2783438d6fd485a78e536e316e4b509e276cccb353fe592bc10"} Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.957944 5047 scope.go:117] "RemoveContainer" containerID="f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc" Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.960835 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc\": container with ID starting with f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc not found: ID does not exist" containerID="f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.960884 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc"} err="failed to get container status \"f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc\": rpc error: code = NotFound desc = could not find container 
\"f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc\": container with ID starting with f8390d9c000c914a320f33a8d41ffd7d7012a5755806288de23fe1aa88f8eecc not found: ID does not exist" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.960930 5047 scope.go:117] "RemoveContainer" containerID="8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.978374 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.978574 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cm66d"] Feb 23 06:48:25 crc kubenswrapper[5047]: E0223 06:48:25.978802 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.478774991 +0000 UTC m=+228.730102125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.978889 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66cfc9b8-5b52-47f9-91be-e41f21385713-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"66cfc9b8-5b52-47f9-91be-e41f21385713\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.979100 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66cfc9b8-5b52-47f9-91be-e41f21385713-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"66cfc9b8-5b52-47f9-91be-e41f21385713\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.979738 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66cfc9b8-5b52-47f9-91be-e41f21385713-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"66cfc9b8-5b52-47f9-91be-e41f21385713\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:25 crc kubenswrapper[5047]: I0223 06:48:25.983076 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cm66d"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.012549 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.012864 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66cfc9b8-5b52-47f9-91be-e41f21385713-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"66cfc9b8-5b52-47f9-91be-e41f21385713\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.016304 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7sssp"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.019734 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ct67m"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.019965 5047 scope.go:117] "RemoveContainer" containerID="8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725" Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.022107 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725\": container with ID starting with 8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725 not found: ID does not exist" containerID="8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.022144 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725"} err="failed to get container status \"8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725\": rpc error: code = NotFound desc = could not find container \"8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725\": container with ID starting with 
8f2cdde113d2295560e3e4bc27cf5a98b742ef79a3ccd0dbf49ab65fa162d725 not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[5047]: W0223 06:48:26.032711 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb848e749_e1a5_47ab_aca5_13e799a2504e.slice/crio-d8b07b669d4ab3ac572382b802ccb9a00656830b6d87ea091e45b2086c3fd2df WatchSource:0}: Error finding container d8b07b669d4ab3ac572382b802ccb9a00656830b6d87ea091e45b2086c3fd2df: Status 404 returned error can't find the container with id d8b07b669d4ab3ac572382b802ccb9a00656830b6d87ea091e45b2086c3fd2df Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.087986 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.088836 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.588820453 +0000 UTC m=+228.840147587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.159036 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.189226 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.189653 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.689609789 +0000 UTC m=+228.940936923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.189721 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.190274 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.690264138 +0000 UTC m=+228.941591272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.246745 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-694f4c68df-rbr22"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.290328 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.290847 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.790830197 +0000 UTC m=+229.042157331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.371965 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71636caa-95b7-4d64-a6f1-1cdf7dc03c07" path="/var/lib/kubelet/pods/71636caa-95b7-4d64-a6f1-1cdf7dc03c07/volumes" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.373832 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4967a59-3c6c-4ec3-9c70-4378ec3702c6" path="/var/lib/kubelet/pods/d4967a59-3c6c-4ec3-9c70-4378ec3702c6/volumes" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.392783 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.393154 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.893138804 +0000 UTC m=+229.144465928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.425577 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:48:26 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld Feb 23 06:48:26 crc kubenswrapper[5047]: [+]process-running ok Feb 23 06:48:26 crc kubenswrapper[5047]: healthz check failed Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.425647 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.434365 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.494596 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.495767 5047 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:26.995696629 +0000 UTC m=+229.247023763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.596206 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.596594 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.096580347 +0000 UTC m=+229.347907481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.697325 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.697485 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.197457375 +0000 UTC m=+229.448784509 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.697622 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.698045 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.19802431 +0000 UTC m=+229.449351444 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.798605 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.798795 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.298760004 +0000 UTC m=+229.550087138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.799512 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.800075 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.300050299 +0000 UTC m=+229.551377433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.900686 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:26 crc kubenswrapper[5047]: E0223 06:48:26.901286 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.401260056 +0000 UTC m=+229.652587200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.937219 5047 generic.go:334] "Generic (PLEG): container finished" podID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerID="88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.937320 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnn8v" event={"ID":"46419724-8d7c-47d4-9d51-3aef5c54ab1b","Type":"ContainerDied","Data":"88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f"} Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.940275 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.945157 5047 generic.go:334] "Generic (PLEG): container finished" podID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerID="09dfcb4a4d2d7764b5e6c3068780a41cda7b7a4231fe0d3eff02f2fbd4d28f30" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.945545 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qnnx" event={"ID":"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0","Type":"ContainerDied","Data":"09dfcb4a4d2d7764b5e6c3068780a41cda7b7a4231fe0d3eff02f2fbd4d28f30"} Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.945629 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qnnx" 
event={"ID":"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0","Type":"ContainerStarted","Data":"23e82e80b0cd4427f7c211f9795e3ae85c2ef77debaae4341d812cbb9a3906a8"} Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.947821 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g6khf"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.949341 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.955158 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.962594 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6khf"] Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.966061 5047 generic.go:334] "Generic (PLEG): container finished" podID="d845d9f8-e91e-46b9-b484-dbfeb6006c7f" containerID="8f6e37f1f6cd5a01340c7a7c2791d58e7a3095aab22cfed70b914d2f52d6dc8d" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.966213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" event={"ID":"d845d9f8-e91e-46b9-b484-dbfeb6006c7f","Type":"ContainerDied","Data":"8f6e37f1f6cd5a01340c7a7c2791d58e7a3095aab22cfed70b914d2f52d6dc8d"} Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.999246 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m247m" event={"ID":"b0031f21-2311-48e2-acb4-fc64475e66a2","Type":"ContainerStarted","Data":"8802e38cfc8d8b8e9617e6eb9083817e64a60eb7105816f8b4749984a5575865"} Feb 23 06:48:26 crc kubenswrapper[5047]: I0223 06:48:26.999326 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m247m" 
event={"ID":"b0031f21-2311-48e2-acb4-fc64475e66a2","Type":"ContainerStarted","Data":"cd9739105233207293437e076457e989e6f72873ab24b350b3770a8ffa61c056"} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.004816 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.004898 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-utilities\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.004940 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-catalog-content\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.004982 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7xd\" (UniqueName: \"kubernetes.io/projected/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-kube-api-access-4f7xd\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.006433 5047 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.506416253 +0000 UTC m=+229.757743377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.016621 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" event={"ID":"81624f2b-37c1-42b3-863c-2f4a61bb68cc","Type":"ContainerStarted","Data":"808aac692cd147797b9a4330c3137dffddf573443fc6fbe488ecb07b73570f1a"} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.016673 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" event={"ID":"81624f2b-37c1-42b3-863c-2f4a61bb68cc","Type":"ContainerStarted","Data":"00470f164bc9557fb541ab3ddad5f4c7098587009a313398e2471f8109c45023"} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.017302 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.020491 5047 generic.go:334] "Generic (PLEG): container finished" podID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerID="dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7" exitCode=0 Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.020565 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zhsns" event={"ID":"83f104c4-c1f9-4714-a807-2a6368b538fd","Type":"ContainerDied","Data":"dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7"} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.037245 5047 generic.go:334] "Generic (PLEG): container finished" podID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerID="504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472" exitCode=0 Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.037412 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sssp" event={"ID":"b848e749-e1a5-47ab-aca5-13e799a2504e","Type":"ContainerDied","Data":"504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472"} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.037461 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sssp" event={"ID":"b848e749-e1a5-47ab-aca5-13e799a2504e","Type":"ContainerStarted","Data":"d8b07b669d4ab3ac572382b802ccb9a00656830b6d87ea091e45b2086c3fd2df"} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.047485 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.074224 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"66cfc9b8-5b52-47f9-91be-e41f21385713","Type":"ContainerStarted","Data":"06673324f90b7bcbe20d22a33dcb8422ffbe90aee8f7f9decdf3fc4eecb0417f"} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.107996 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.108361 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-utilities\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.108401 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-catalog-content\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.108440 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7xd\" (UniqueName: \"kubernetes.io/projected/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-kube-api-access-4f7xd\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.109754 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.609726749 +0000 UTC m=+229.861053883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.111586 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-catalog-content\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.111857 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-utilities\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.112223 5047 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.144221 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7xd\" (UniqueName: \"kubernetes.io/projected/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-kube-api-access-4f7xd\") pod \"redhat-marketplace-g6khf\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.184974 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" podStartSLOduration=4.184951865 podStartE2EDuration="4.184951865s" podCreationTimestamp="2026-02-23 06:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:27.162370329 +0000 UTC m=+229.413697463" watchObservedRunningTime="2026-02-23 06:48:27.184951865 +0000 UTC m=+229.436278999" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.210869 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.211360 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.711340397 +0000 UTC m=+229.962667531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.266316 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.312689 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.312942 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.812888343 +0000 UTC m=+230.064215477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.313109 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.313564 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.813546302 +0000 UTC m=+230.064873426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.324519 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nwplb"] Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.325619 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.339446 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwplb"] Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.418088 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.418444 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9gt\" (UniqueName: \"kubernetes.io/projected/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-kube-api-access-pn9gt\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 
06:48:27.418515 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-catalog-content\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.418556 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-utilities\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.418704 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:48:27.918685487 +0000 UTC m=+230.170012621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.423758 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:48:27 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld Feb 23 06:48:27 crc kubenswrapper[5047]: [+]process-running ok Feb 23 06:48:27 crc kubenswrapper[5047]: healthz check failed Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.423855 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.520144 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9gt\" (UniqueName: \"kubernetes.io/projected/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-kube-api-access-pn9gt\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.520313 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-catalog-content\") pod \"redhat-marketplace-nwplb\" (UID: 
\"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.520354 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.520375 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-utilities\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.521505 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-utilities\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.521605 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-catalog-content\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.521920 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 06:48:28.02188521 +0000 UTC m=+230.273212344 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8fh6b" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.570102 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9gt\" (UniqueName: \"kubernetes.io/projected/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-kube-api-access-pn9gt\") pod \"redhat-marketplace-nwplb\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.593030 5047 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-23T06:48:27.112249469Z","Handler":null,"Name":""} Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.623678 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:27 crc kubenswrapper[5047]: E0223 06:48:27.624063 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-23 06:48:28.124040382 +0000 UTC m=+230.375367516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.637038 5047 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.637329 5047 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.649541 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6khf"] Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.663551 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.710214 5047 patch_prober.go:28] interesting pod/downloads-7954f5f757-75wvb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.710274 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-75wvb" podUID="b4a2bba0-ebb9-4923-a829-69112ef9c89c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.710658 5047 patch_prober.go:28] interesting pod/downloads-7954f5f757-75wvb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.710672 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-75wvb" podUID="b4a2bba0-ebb9-4923-a829-69112ef9c89c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.725157 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.739419 5047 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.739746 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.751764 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.752066 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.758226 5047 patch_prober.go:28] interesting pod/console-f9d7485db-v5zcl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.758319 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v5zcl" podUID="cd20eb8d-5a96-407f-a898-1dad49ba8355" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.805550 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8fh6b\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.827448 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.870262 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.938573 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f7bfp"] Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.944047 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.946999 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.954446 5047 patch_prober.go:28] interesting pod/apiserver-76f77b778f-9l6zf container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]log ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]etcd ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/generic-apiserver-start-informers ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/max-in-flight-filter ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 23 06:48:27 crc kubenswrapper[5047]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 23 06:48:27 crc kubenswrapper[5047]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/project.openshift.io-projectcache ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/openshift.io-startinformers ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 23 06:48:27 crc kubenswrapper[5047]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 23 06:48:27 crc kubenswrapper[5047]: livez check failed Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.954514 
5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf" podUID="ac96f0e5-3605-48c8-80b4-ef4e02123af8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:27 crc kubenswrapper[5047]: I0223 06:48:27.970883 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7bfp"] Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.002556 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.032309 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmbls\" (UniqueName: \"kubernetes.io/projected/c470fb1c-868d-420e-b6e2-61369cecd6c3-kube-api-access-fmbls\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.032382 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-catalog-content\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.032407 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-utilities\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.092617 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-g6khf" event={"ID":"8964090d-1ca0-4cb9-b2a6-f4293fdf9591","Type":"ContainerStarted","Data":"ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c"} Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.092786 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6khf" event={"ID":"8964090d-1ca0-4cb9-b2a6-f4293fdf9591","Type":"ContainerStarted","Data":"a5f3a076d07f09179a85af2e69202d09d71ec16134ad0f709bf9ee2d323ac77e"} Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.107026 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m247m" event={"ID":"b0031f21-2311-48e2-acb4-fc64475e66a2","Type":"ContainerStarted","Data":"8f51d7cc01ed7469e4b3b0edbe17c8aa6a51ce3ddce8870b22f981ab24383805"} Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.129054 5047 generic.go:334] "Generic (PLEG): container finished" podID="66cfc9b8-5b52-47f9-91be-e41f21385713" containerID="b7eb80b68e412ead5ca76cf0bd99d4e6e6e8a72fd0649ff4bdcf6944501bded0" exitCode=0 Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.129364 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"66cfc9b8-5b52-47f9-91be-e41f21385713","Type":"ContainerDied","Data":"b7eb80b68e412ead5ca76cf0bd99d4e6e6e8a72fd0649ff4bdcf6944501bded0"} Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.136440 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-catalog-content\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.136499 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-utilities\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.136623 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmbls\" (UniqueName: \"kubernetes.io/projected/c470fb1c-868d-420e-b6e2-61369cecd6c3-kube-api-access-fmbls\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.139345 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-catalog-content\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.140022 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-utilities\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.148966 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-m247m" podStartSLOduration=13.148949992 podStartE2EDuration="13.148949992s" podCreationTimestamp="2026-02-23 06:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:28.146144694 +0000 UTC m=+230.397471838" watchObservedRunningTime="2026-02-23 06:48:28.148949992 +0000 UTC m=+230.400277126" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 
06:48:28.154408 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwplb"] Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.177654 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmbls\" (UniqueName: \"kubernetes.io/projected/c470fb1c-868d-420e-b6e2-61369cecd6c3-kube-api-access-fmbls\") pod \"redhat-operators-f7bfp\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.273460 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.289833 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.292087 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.297611 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.301094 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.327623 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.340848 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6289171-aad5-413e-ae73-8cab9950b113-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.340963 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6289171-aad5-413e-ae73-8cab9950b113-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.381041 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.383836 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzzqw"] Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.390381 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzzqw"] Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.390533 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzzqw" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.413564 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.415397 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"] Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.427750 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.431913 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.432220 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.432423 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.432544 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.432775 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.437716 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.437780 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"]
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.438729 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:28 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:28 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:28 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.438777 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461360 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-catalog-content\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461443 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-client-ca\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461524 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6289171-aad5-413e-ae73-8cab9950b113-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461571 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6289171-aad5-413e-ae73-8cab9950b113-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461598 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skmdz\" (UniqueName: \"kubernetes.io/projected/f8d5e28d-8de9-4514-92d8-05223c439cd4-kube-api-access-skmdz\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461632 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-config\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461663 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmph\" (UniqueName: \"kubernetes.io/projected/7fd9d004-a17c-4302-b139-b7fe123a5896-kube-api-access-pvmph\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461705 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-utilities\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.461748 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9d004-a17c-4302-b139-b7fe123a5896-serving-cert\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.462723 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6289171-aad5-413e-ae73-8cab9950b113-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.543809 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6289171-aad5-413e-ae73-8cab9950b113-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.571117 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nvbsc"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.573257 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-config\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.573304 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmph\" (UniqueName: \"kubernetes.io/projected/7fd9d004-a17c-4302-b139-b7fe123a5896-kube-api-access-pvmph\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.573335 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-utilities\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.573365 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9d004-a17c-4302-b139-b7fe123a5896-serving-cert\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.573398 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-catalog-content\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.573422 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-client-ca\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.573481 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skmdz\" (UniqueName: \"kubernetes.io/projected/f8d5e28d-8de9-4514-92d8-05223c439cd4-kube-api-access-skmdz\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.574519 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-utilities\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.578180 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-config\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.578836 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-catalog-content\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.582971 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8fh6b"]
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.590356 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9d004-a17c-4302-b139-b7fe123a5896-serving-cert\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.605348 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-client-ca\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.616737 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skmdz\" (UniqueName: \"kubernetes.io/projected/f8d5e28d-8de9-4514-92d8-05223c439cd4-kube-api-access-skmdz\") pod \"redhat-operators-dzzqw\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.620064 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmph\" (UniqueName: \"kubernetes.io/projected/7fd9d004-a17c-4302-b139-b7fe123a5896-kube-api-access-pvmph\") pod \"route-controller-manager-677bd9c759-kqbn8\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.630470 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.635394 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.666975 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m5fb6"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.667730 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.725476 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7bfp"]
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.727951 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzzqw"
Feb 23 06:48:28 crc kubenswrapper[5047]: W0223 06:48:28.762235 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc470fb1c_868d_420e_b6e2_61369cecd6c3.slice/crio-3433b147794ed354200ac095348e1ec0837f0ebb942c33b162ae541674ee9648 WatchSource:0}: Error finding container 3433b147794ed354200ac095348e1ec0837f0ebb942c33b162ae541674ee9648: Status 404 returned error can't find the container with id 3433b147794ed354200ac095348e1ec0837f0ebb942c33b162ae541674ee9648
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.766450 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.777777 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.789286 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-config-volume\") pod \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") "
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.789383 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-secret-volume\") pod \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") "
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.789542 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhd28\" (UniqueName: \"kubernetes.io/projected/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-kube-api-access-hhd28\") pod \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\" (UID: \"d845d9f8-e91e-46b9-b484-dbfeb6006c7f\") "
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.789967 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-config-volume" (OuterVolumeSpecName: "config-volume") pod "d845d9f8-e91e-46b9-b484-dbfeb6006c7f" (UID: "d845d9f8-e91e-46b9-b484-dbfeb6006c7f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.809111 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d845d9f8-e91e-46b9-b484-dbfeb6006c7f" (UID: "d845d9f8-e91e-46b9-b484-dbfeb6006c7f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.848647 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-kube-api-access-hhd28" (OuterVolumeSpecName: "kube-api-access-hhd28") pod "d845d9f8-e91e-46b9-b484-dbfeb6006c7f" (UID: "d845d9f8-e91e-46b9-b484-dbfeb6006c7f"). InnerVolumeSpecName "kube-api-access-hhd28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.893141 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhd28\" (UniqueName: \"kubernetes.io/projected/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-kube-api-access-hhd28\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.893175 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:28 crc kubenswrapper[5047]: I0223 06:48:28.893187 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d845d9f8-e91e-46b9-b484-dbfeb6006c7f-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.185917 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4" event={"ID":"d845d9f8-e91e-46b9-b484-dbfeb6006c7f","Type":"ContainerDied","Data":"c7706a9f207e8d0896d04400047ea942eabb8238184365de6098db73b8ee6d02"}
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.186305 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7706a9f207e8d0896d04400047ea942eabb8238184365de6098db73b8ee6d02"
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.186371 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4"
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.194286 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" event={"ID":"a75720bd-50a7-4a3c-b12c-e901126d4382","Type":"ContainerStarted","Data":"6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029"}
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.194338 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" event={"ID":"a75720bd-50a7-4a3c-b12c-e901126d4382","Type":"ContainerStarted","Data":"e24294f2db085f188d0a25250af1c632457d9e8508965092c6b6915e2d425e7c"}
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.211492 5047 generic.go:334] "Generic (PLEG): container finished" podID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerID="85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8" exitCode=0
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.211592 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwplb" event={"ID":"2fb237ea-bf04-4d9d-bbb0-2363c97e139b","Type":"ContainerDied","Data":"85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8"}
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.211623 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwplb" event={"ID":"2fb237ea-bf04-4d9d-bbb0-2363c97e139b","Type":"ContainerStarted","Data":"9d8aeaa52ab88af05d4b401edb4392c9626f20da445782a3132827ac7572b45e"}
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.221516 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7bfp" event={"ID":"c470fb1c-868d-420e-b6e2-61369cecd6c3","Type":"ContainerStarted","Data":"3433b147794ed354200ac095348e1ec0837f0ebb942c33b162ae541674ee9648"}
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.226394 5047 generic.go:334] "Generic (PLEG): container finished" podID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerID="ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c" exitCode=0
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.226531 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6khf" event={"ID":"8964090d-1ca0-4cb9-b2a6-f4293fdf9591","Type":"ContainerDied","Data":"ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c"}
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.258607 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.436263 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:29 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:29 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:29 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.436326 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.508798 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzzqw"]
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.552232 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"]
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.889624 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.918130 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66cfc9b8-5b52-47f9-91be-e41f21385713-kube-api-access\") pod \"66cfc9b8-5b52-47f9-91be-e41f21385713\" (UID: \"66cfc9b8-5b52-47f9-91be-e41f21385713\") "
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.918276 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66cfc9b8-5b52-47f9-91be-e41f21385713-kubelet-dir\") pod \"66cfc9b8-5b52-47f9-91be-e41f21385713\" (UID: \"66cfc9b8-5b52-47f9-91be-e41f21385713\") "
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.918712 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66cfc9b8-5b52-47f9-91be-e41f21385713-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "66cfc9b8-5b52-47f9-91be-e41f21385713" (UID: "66cfc9b8-5b52-47f9-91be-e41f21385713"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:48:29 crc kubenswrapper[5047]: I0223 06:48:29.929389 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66cfc9b8-5b52-47f9-91be-e41f21385713-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "66cfc9b8-5b52-47f9-91be-e41f21385713" (UID: "66cfc9b8-5b52-47f9-91be-e41f21385713"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.019039 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66cfc9b8-5b52-47f9-91be-e41f21385713-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.019403 5047 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66cfc9b8-5b52-47f9-91be-e41f21385713-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.239266 5047 generic.go:334] "Generic (PLEG): container finished" podID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerID="69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42" exitCode=0
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.239387 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7bfp" event={"ID":"c470fb1c-868d-420e-b6e2-61369cecd6c3","Type":"ContainerDied","Data":"69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42"}
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.245753 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" event={"ID":"7fd9d004-a17c-4302-b139-b7fe123a5896","Type":"ContainerStarted","Data":"0b9c00f2a8142c693529ca056eb5c0c25995131f6b2d6ea7cefac9296a3c3f58"}
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.245786 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" event={"ID":"7fd9d004-a17c-4302-b139-b7fe123a5896","Type":"ContainerStarted","Data":"de552b40290aa71ce85734b8cbaaba5ac0a09b00b13dadcdc8a334b8905f3aab"}
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.247212 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzzqw" event={"ID":"f8d5e28d-8de9-4514-92d8-05223c439cd4","Type":"ContainerStarted","Data":"783f0f4f191e815a95308c08b1128190bd83b481cc4d56447ee3beb71cac639c"}
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.249073 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"66cfc9b8-5b52-47f9-91be-e41f21385713","Type":"ContainerDied","Data":"06673324f90b7bcbe20d22a33dcb8422ffbe90aee8f7f9decdf3fc4eecb0417f"}
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.249099 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06673324f90b7bcbe20d22a33dcb8422ffbe90aee8f7f9decdf3fc4eecb0417f"
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.249132 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.251498 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c6289171-aad5-413e-ae73-8cab9950b113","Type":"ContainerStarted","Data":"76513c50a2a32207dcec89011f3f8c73fa7e4b37ed1bf99f64521568ba4f1f0c"}
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.251653 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b"
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.279581 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" podStartSLOduration=161.279559104 podStartE2EDuration="2m41.279559104s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:30.279249026 +0000 UTC m=+232.530576170" watchObservedRunningTime="2026-02-23 06:48:30.279559104 +0000 UTC m=+232.530886238"
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.415617 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:30 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:30 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:30 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:30 crc kubenswrapper[5047]: I0223 06:48:30.415791 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.263131 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c6289171-aad5-413e-ae73-8cab9950b113","Type":"ContainerStarted","Data":"83f0635fb5f37bfdecb59b68655631cb23d2113ad3c3a281e8c59fb26763bc64"}
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.266100 5047 generic.go:334] "Generic (PLEG): container finished" podID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerID="11d43239cc1e1694ddfad025e3fa1e2d647ff401c247e3a65dcae4c19124eea7" exitCode=0
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.266213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzzqw" event={"ID":"f8d5e28d-8de9-4514-92d8-05223c439cd4","Type":"ContainerDied","Data":"11d43239cc1e1694ddfad025e3fa1e2d647ff401c247e3a65dcae4c19124eea7"}
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.266539 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.286612 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.290181 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.290159694 podStartE2EDuration="3.290159694s" podCreationTimestamp="2026-02-23 06:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:31.285059662 +0000 UTC m=+233.536386796" watchObservedRunningTime="2026-02-23 06:48:31.290159694 +0000 UTC m=+233.541486828"
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.326527 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" podStartSLOduration=7.326500172 podStartE2EDuration="7.326500172s" podCreationTimestamp="2026-02-23 06:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:31.325664939 +0000 UTC m=+233.576992093" watchObservedRunningTime="2026-02-23 06:48:31.326500172 +0000 UTC m=+233.577827326"
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.413284 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:31 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:31 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:31 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:31 crc kubenswrapper[5047]: I0223 06:48:31.413362 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:32 crc kubenswrapper[5047]: I0223 06:48:32.286992 5047 generic.go:334] "Generic (PLEG): container finished" podID="c6289171-aad5-413e-ae73-8cab9950b113" containerID="83f0635fb5f37bfdecb59b68655631cb23d2113ad3c3a281e8c59fb26763bc64" exitCode=0
Feb 23 06:48:32 crc kubenswrapper[5047]: I0223 06:48:32.287177 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c6289171-aad5-413e-ae73-8cab9950b113","Type":"ContainerDied","Data":"83f0635fb5f37bfdecb59b68655631cb23d2113ad3c3a281e8c59fb26763bc64"}
Feb 23 06:48:32 crc kubenswrapper[5047]: I0223 06:48:32.415886 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:32 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:32 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:32 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:32 crc kubenswrapper[5047]: I0223 06:48:32.416033 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:32 crc kubenswrapper[5047]: I0223 06:48:32.940291 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:32 crc kubenswrapper[5047]: I0223 06:48:32.945354 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-9l6zf"
Feb 23 06:48:34 crc kubenswrapper[5047]: I0223 06:48:34.664977 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:34 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:34 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:34 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:34 crc kubenswrapper[5047]: I0223 06:48:34.666162 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:34 crc kubenswrapper[5047]: I0223 06:48:34.703350 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p7tmt"
Feb 23 06:48:35 crc kubenswrapper[5047]: I0223 06:48:35.414810 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:35 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:35 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:35 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:35 crc kubenswrapper[5047]: I0223 06:48:35.415229 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:36 crc kubenswrapper[5047]: I0223 06:48:36.416850 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:36 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:36 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:36 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:36 crc kubenswrapper[5047]: I0223 06:48:36.416946 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.414551 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:48:37 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld
Feb 23 06:48:37 crc kubenswrapper[5047]: [+]process-running ok
Feb 23 06:48:37 crc kubenswrapper[5047]: healthz check failed
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.414640 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.600053 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp"
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.626885 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb811549-5811-4996-ba8c-6f8848a80ce7-metrics-certs\") pod \"network-metrics-daemon-54jbp\" (UID: \"cb811549-5811-4996-ba8c-6f8848a80ce7\") " pod="openshift-multus/network-metrics-daemon-54jbp"
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.709714 5047 patch_prober.go:28] interesting pod/downloads-7954f5f757-75wvb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.709791 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-75wvb" podUID="b4a2bba0-ebb9-4923-a829-69112ef9c89c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.709815 5047 patch_prober.go:28] interesting pod/downloads-7954f5f757-75wvb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.709890 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-75wvb" podUID="b4a2bba0-ebb9-4923-a829-69112ef9c89c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.749365 5047 patch_prober.go:28] interesting pod/console-f9d7485db-v5zcl container/console
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.749448 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v5zcl" podUID="cd20eb8d-5a96-407f-a898-1dad49ba8355" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 23 06:48:37 crc kubenswrapper[5047]: I0223 06:48:37.896578 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-54jbp" Feb 23 06:48:38 crc kubenswrapper[5047]: I0223 06:48:38.414766 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:48:38 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld Feb 23 06:48:38 crc kubenswrapper[5047]: [+]process-running ok Feb 23 06:48:38 crc kubenswrapper[5047]: healthz check failed Feb 23 06:48:38 crc kubenswrapper[5047]: I0223 06:48:38.414825 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:39 crc kubenswrapper[5047]: I0223 06:48:39.413638 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:48:39 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld Feb 23 06:48:39 crc kubenswrapper[5047]: 
[+]process-running ok Feb 23 06:48:39 crc kubenswrapper[5047]: healthz check failed Feb 23 06:48:39 crc kubenswrapper[5047]: I0223 06:48:39.414064 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:40 crc kubenswrapper[5047]: I0223 06:48:40.464684 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:48:40 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld Feb 23 06:48:40 crc kubenswrapper[5047]: [+]process-running ok Feb 23 06:48:40 crc kubenswrapper[5047]: healthz check failed Feb 23 06:48:40 crc kubenswrapper[5047]: I0223 06:48:40.464770 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:41 crc kubenswrapper[5047]: I0223 06:48:41.413763 5047 patch_prober.go:28] interesting pod/router-default-5444994796-g5zp5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:48:41 crc kubenswrapper[5047]: [-]has-synced failed: reason withheld Feb 23 06:48:41 crc kubenswrapper[5047]: [+]process-running ok Feb 23 06:48:41 crc kubenswrapper[5047]: healthz check failed Feb 23 06:48:41 crc kubenswrapper[5047]: I0223 06:48:41.413851 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g5zp5" podUID="07d3cb2b-5f23-45b2-b650-a2ed7aab3f2e" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.110789 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.116479 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-g5zp5" Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.358520 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.500776 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6289171-aad5-413e-ae73-8cab9950b113-kube-api-access\") pod \"c6289171-aad5-413e-ae73-8cab9950b113\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.501233 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6289171-aad5-413e-ae73-8cab9950b113-kubelet-dir\") pod \"c6289171-aad5-413e-ae73-8cab9950b113\" (UID: \"c6289171-aad5-413e-ae73-8cab9950b113\") " Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.501389 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6289171-aad5-413e-ae73-8cab9950b113-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c6289171-aad5-413e-ae73-8cab9950b113" (UID: "c6289171-aad5-413e-ae73-8cab9950b113"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.501716 5047 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6289171-aad5-413e-ae73-8cab9950b113-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.508435 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6289171-aad5-413e-ae73-8cab9950b113-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c6289171-aad5-413e-ae73-8cab9950b113" (UID: "c6289171-aad5-413e-ae73-8cab9950b113"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:43 crc kubenswrapper[5047]: I0223 06:48:43.603021 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6289171-aad5-413e-ae73-8cab9950b113-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:44 crc kubenswrapper[5047]: I0223 06:48:44.141311 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c6289171-aad5-413e-ae73-8cab9950b113","Type":"ContainerDied","Data":"76513c50a2a32207dcec89011f3f8c73fa7e4b37ed1bf99f64521568ba4f1f0c"} Feb 23 06:48:44 crc kubenswrapper[5047]: I0223 06:48:44.141358 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:48:44 crc kubenswrapper[5047]: I0223 06:48:44.141379 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76513c50a2a32207dcec89011f3f8c73fa7e4b37ed1bf99f64521568ba4f1f0c" Feb 23 06:48:46 crc kubenswrapper[5047]: I0223 06:48:46.760194 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:48:46 crc kubenswrapper[5047]: I0223 06:48:46.760271 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:48:47 crc kubenswrapper[5047]: I0223 06:48:47.715756 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-75wvb" Feb 23 06:48:47 crc kubenswrapper[5047]: I0223 06:48:47.754459 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:47 crc kubenswrapper[5047]: I0223 06:48:47.763744 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 06:48:48 crc kubenswrapper[5047]: I0223 06:48:48.010534 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:48:56 crc kubenswrapper[5047]: E0223 06:48:56.815489 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 23 06:48:56 crc kubenswrapper[5047]: E0223 06:48:56.816435 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skmdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dzzqw_openshift-marketplace(f8d5e28d-8de9-4514-92d8-05223c439cd4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Feb 23 06:48:56 crc kubenswrapper[5047]: E0223 06:48:56.817802 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dzzqw" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" Feb 23 06:48:57 crc kubenswrapper[5047]: E0223 06:48:57.505602 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dzzqw" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" Feb 23 06:48:57 crc kubenswrapper[5047]: E0223 06:48:57.610461 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 23 06:48:57 crc kubenswrapper[5047]: E0223 06:48:57.611173 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmbls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-f7bfp_openshift-marketplace(c470fb1c-868d-420e-b6e2-61369cecd6c3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:48:57 crc kubenswrapper[5047]: E0223 06:48:57.612453 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-f7bfp" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" Feb 23 06:48:57 crc 
kubenswrapper[5047]: E0223 06:48:57.618345 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 23 06:48:57 crc kubenswrapper[5047]: E0223 06:48:57.618551 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q5pts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-zhsns_openshift-marketplace(83f104c4-c1f9-4714-a807-2a6368b538fd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:48:57 crc kubenswrapper[5047]: E0223 06:48:57.619835 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zhsns" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" Feb 23 06:48:58 crc kubenswrapper[5047]: I0223 06:48:58.745505 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lxddc" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 06:48:59.366539 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zhsns" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 06:48:59.366748 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-f7bfp" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 06:48:59.443407 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 
06:48:59.443604 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c48zt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7sssp_openshift-marketplace(b848e749-e1a5-47ab-aca5-13e799a2504e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 06:48:59.444842 5047 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7sssp" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 06:48:59.462007 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 06:48:59.462697 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jl2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,Secco
mpProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gnn8v_openshift-marketplace(46419724-8d7c-47d4-9d51-3aef5c54ab1b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:48:59 crc kubenswrapper[5047]: E0223 06:48:59.464279 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gnn8v" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.814057 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7sssp" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.814088 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gnn8v" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.841572 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" 
Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.841860 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pn9gt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nwplb_openshift-marketplace(2fb237ea-bf04-4d9d-bbb0-2363c97e139b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.843151 5047 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nwplb" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.906778 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.908022 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4f7xd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,Run
AsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-g6khf_openshift-marketplace(8964090d-1ca0-4cb9-b2a6-f4293fdf9591): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:49:00 crc kubenswrapper[5047]: E0223 06:49:00.909346 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-g6khf" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" Feb 23 06:49:01 crc kubenswrapper[5047]: I0223 06:49:01.228129 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-54jbp"] Feb 23 06:49:01 crc kubenswrapper[5047]: I0223 06:49:01.263592 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-54jbp" event={"ID":"cb811549-5811-4996-ba8c-6f8848a80ce7","Type":"ContainerStarted","Data":"dc9c03d3dfbfbb435ec6d5fab9032f8a125c8bb08df69948d771b2d6ed49a51a"} Feb 23 06:49:01 crc kubenswrapper[5047]: I0223 06:49:01.265654 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qnnx" event={"ID":"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0","Type":"ContainerStarted","Data":"c7fd5be5c06c3e053f9fdb36a9276849faad45b973d7f3265fd133bd30663289"} Feb 23 06:49:01 crc kubenswrapper[5047]: E0223 06:49:01.267409 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-g6khf" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" Feb 23 06:49:01 crc kubenswrapper[5047]: E0223 06:49:01.267986 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nwplb" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" Feb 23 06:49:01 crc kubenswrapper[5047]: I0223 06:49:01.691269 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:49:02 crc kubenswrapper[5047]: I0223 06:49:02.274189 5047 generic.go:334] "Generic (PLEG): container finished" podID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerID="c7fd5be5c06c3e053f9fdb36a9276849faad45b973d7f3265fd133bd30663289" exitCode=0 Feb 23 06:49:02 crc kubenswrapper[5047]: I0223 06:49:02.274322 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qnnx" event={"ID":"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0","Type":"ContainerDied","Data":"c7fd5be5c06c3e053f9fdb36a9276849faad45b973d7f3265fd133bd30663289"} Feb 23 06:49:02 crc kubenswrapper[5047]: I0223 06:49:02.281792 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-54jbp" event={"ID":"cb811549-5811-4996-ba8c-6f8848a80ce7","Type":"ContainerStarted","Data":"d3a5900df9da91fa83126150e0c608a4cfa7219512cdb91fb1a67be9adc649ec"} Feb 23 06:49:02 crc kubenswrapper[5047]: I0223 06:49:02.281848 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-54jbp" event={"ID":"cb811549-5811-4996-ba8c-6f8848a80ce7","Type":"ContainerStarted","Data":"81342d55363b4376f333ded716af391f570d855ad29cf0032aa5533a3fa3e082"} Feb 
23 06:49:02 crc kubenswrapper[5047]: I0223 06:49:02.324221 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-54jbp" podStartSLOduration=193.324184757 podStartE2EDuration="3m13.324184757s" podCreationTimestamp="2026-02-23 06:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:49:02.311737362 +0000 UTC m=+264.563064536" watchObservedRunningTime="2026-02-23 06:49:02.324184757 +0000 UTC m=+264.575511891" Feb 23 06:49:03 crc kubenswrapper[5047]: I0223 06:49:03.291154 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qnnx" event={"ID":"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0","Type":"ContainerStarted","Data":"3c7d1c5a6e3cdc3bc7c29ef80833f29f0c4972c4aa92a2c21c4fcd43fb29ea45"} Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.083609 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2qnnx" podStartSLOduration=3.327576614 podStartE2EDuration="39.083585245s" podCreationTimestamp="2026-02-23 06:48:25 +0000 UTC" firstStartedPulling="2026-02-23 06:48:26.94715273 +0000 UTC m=+229.198479864" lastFinishedPulling="2026-02-23 06:49:02.703161361 +0000 UTC m=+264.954488495" observedRunningTime="2026-02-23 06:49:03.314067769 +0000 UTC m=+265.565394913" watchObservedRunningTime="2026-02-23 06:49:04.083585245 +0000 UTC m=+266.334912379" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.086662 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:49:04 crc kubenswrapper[5047]: E0223 06:49:04.086936 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d845d9f8-e91e-46b9-b484-dbfeb6006c7f" containerName="collect-profiles" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.086957 5047 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d845d9f8-e91e-46b9-b484-dbfeb6006c7f" containerName="collect-profiles" Feb 23 06:49:04 crc kubenswrapper[5047]: E0223 06:49:04.086971 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6289171-aad5-413e-ae73-8cab9950b113" containerName="pruner" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.086977 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6289171-aad5-413e-ae73-8cab9950b113" containerName="pruner" Feb 23 06:49:04 crc kubenswrapper[5047]: E0223 06:49:04.086988 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cfc9b8-5b52-47f9-91be-e41f21385713" containerName="pruner" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.086995 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cfc9b8-5b52-47f9-91be-e41f21385713" containerName="pruner" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.087094 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6289171-aad5-413e-ae73-8cab9950b113" containerName="pruner" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.087111 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d845d9f8-e91e-46b9-b484-dbfeb6006c7f" containerName="collect-profiles" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.087121 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="66cfc9b8-5b52-47f9-91be-e41f21385713" containerName="pruner" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.087544 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.093017 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.098976 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.102734 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.184971 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a31cf027-e43c-4370-9069-8520e051182b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.185054 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a31cf027-e43c-4370-9069-8520e051182b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.286713 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a31cf027-e43c-4370-9069-8520e051182b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.286814 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a31cf027-e43c-4370-9069-8520e051182b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.287432 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a31cf027-e43c-4370-9069-8520e051182b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.309629 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a31cf027-e43c-4370-9069-8520e051182b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.425049 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:04 crc kubenswrapper[5047]: I0223 06:49:04.838803 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:49:04 crc kubenswrapper[5047]: W0223 06:49:04.840540 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda31cf027_e43c_4370_9069_8520e051182b.slice/crio-b4f02b9843b024cf633ec5b10c84a602fc438465d84fcb0b7ff2c7129a3b430d WatchSource:0}: Error finding container b4f02b9843b024cf633ec5b10c84a602fc438465d84fcb0b7ff2c7129a3b430d: Status 404 returned error can't find the container with id b4f02b9843b024cf633ec5b10c84a602fc438465d84fcb0b7ff2c7129a3b430d Feb 23 06:49:05 crc kubenswrapper[5047]: I0223 06:49:05.303687 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a31cf027-e43c-4370-9069-8520e051182b","Type":"ContainerStarted","Data":"b4f02b9843b024cf633ec5b10c84a602fc438465d84fcb0b7ff2c7129a3b430d"} Feb 23 06:49:05 crc kubenswrapper[5047]: I0223 06:49:05.447313 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:49:05 crc kubenswrapper[5047]: I0223 06:49:05.447370 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:49:05 crc kubenswrapper[5047]: I0223 06:49:05.578651 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:49:06 crc kubenswrapper[5047]: I0223 06:49:06.316007 5047 generic.go:334] "Generic (PLEG): container finished" podID="a31cf027-e43c-4370-9069-8520e051182b" containerID="f0c30d1597499bb80d22b0a0756c3e6999e8ce8640461312775c774c5c41608a" exitCode=0 Feb 23 06:49:06 crc kubenswrapper[5047]: I0223 06:49:06.316087 5047 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a31cf027-e43c-4370-9069-8520e051182b","Type":"ContainerDied","Data":"f0c30d1597499bb80d22b0a0756c3e6999e8ce8640461312775c774c5c41608a"} Feb 23 06:49:07 crc kubenswrapper[5047]: I0223 06:49:07.630366 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:07 crc kubenswrapper[5047]: I0223 06:49:07.732268 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a31cf027-e43c-4370-9069-8520e051182b-kube-api-access\") pod \"a31cf027-e43c-4370-9069-8520e051182b\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " Feb 23 06:49:07 crc kubenswrapper[5047]: I0223 06:49:07.732367 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a31cf027-e43c-4370-9069-8520e051182b-kubelet-dir\") pod \"a31cf027-e43c-4370-9069-8520e051182b\" (UID: \"a31cf027-e43c-4370-9069-8520e051182b\") " Feb 23 06:49:07 crc kubenswrapper[5047]: I0223 06:49:07.732723 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a31cf027-e43c-4370-9069-8520e051182b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a31cf027-e43c-4370-9069-8520e051182b" (UID: "a31cf027-e43c-4370-9069-8520e051182b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:07 crc kubenswrapper[5047]: I0223 06:49:07.737685 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31cf027-e43c-4370-9069-8520e051182b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a31cf027-e43c-4370-9069-8520e051182b" (UID: "a31cf027-e43c-4370-9069-8520e051182b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:07 crc kubenswrapper[5047]: I0223 06:49:07.834626 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a31cf027-e43c-4370-9069-8520e051182b-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:07 crc kubenswrapper[5047]: I0223 06:49:07.834680 5047 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a31cf027-e43c-4370-9069-8520e051182b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:08 crc kubenswrapper[5047]: I0223 06:49:08.328352 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a31cf027-e43c-4370-9069-8520e051182b","Type":"ContainerDied","Data":"b4f02b9843b024cf633ec5b10c84a602fc438465d84fcb0b7ff2c7129a3b430d"} Feb 23 06:49:08 crc kubenswrapper[5047]: I0223 06:49:08.328407 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4f02b9843b024cf633ec5b10c84a602fc438465d84fcb0b7ff2c7129a3b430d" Feb 23 06:49:08 crc kubenswrapper[5047]: I0223 06:49:08.328473 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.285820 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:49:10 crc kubenswrapper[5047]: E0223 06:49:10.287739 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a31cf027-e43c-4370-9069-8520e051182b" containerName="pruner" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.287845 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a31cf027-e43c-4370-9069-8520e051182b" containerName="pruner" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.288057 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a31cf027-e43c-4370-9069-8520e051182b" containerName="pruner" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.288646 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.290767 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.290967 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.299450 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.370001 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-var-lock\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.370531 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23875e6d-032c-4db1-ae77-f1f19254115b-kube-api-access\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.370578 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.471858 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-var-lock\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.472194 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23875e6d-032c-4db1-ae77-f1f19254115b-kube-api-access\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.472422 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.472500 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.472058 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-var-lock\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.496921 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23875e6d-032c-4db1-ae77-f1f19254115b-kube-api-access\") pod \"installer-9-crc\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:10 crc kubenswrapper[5047]: I0223 06:49:10.663623 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:11 crc kubenswrapper[5047]: I0223 06:49:11.077715 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:49:11 crc kubenswrapper[5047]: I0223 06:49:11.361031 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"23875e6d-032c-4db1-ae77-f1f19254115b","Type":"ContainerStarted","Data":"12164f032ba3cde43f1fdea995b3c6af6f06495738f7be698bbe841e2f57aa67"} Feb 23 06:49:11 crc kubenswrapper[5047]: I0223 06:49:11.361607 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"23875e6d-032c-4db1-ae77-f1f19254115b","Type":"ContainerStarted","Data":"0d6c79ebc8514b2e91042c88b23a81c3da4ce8054c4204720a752cbe79942f01"} Feb 23 06:49:11 crc kubenswrapper[5047]: I0223 06:49:11.362804 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhsns" event={"ID":"83f104c4-c1f9-4714-a807-2a6368b538fd","Type":"ContainerStarted","Data":"b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2"} Feb 23 06:49:11 crc kubenswrapper[5047]: I0223 06:49:11.364549 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7bfp" event={"ID":"c470fb1c-868d-420e-b6e2-61369cecd6c3","Type":"ContainerStarted","Data":"f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78"} Feb 23 06:49:11 crc kubenswrapper[5047]: I0223 06:49:11.415441 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.4154151750000001 podStartE2EDuration="1.415415175s" podCreationTimestamp="2026-02-23 06:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:49:11.39545092 +0000 UTC 
m=+273.646778054" watchObservedRunningTime="2026-02-23 06:49:11.415415175 +0000 UTC m=+273.666742309" Feb 23 06:49:12 crc kubenswrapper[5047]: I0223 06:49:12.374081 5047 generic.go:334] "Generic (PLEG): container finished" podID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerID="b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2" exitCode=0 Feb 23 06:49:12 crc kubenswrapper[5047]: I0223 06:49:12.374152 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhsns" event={"ID":"83f104c4-c1f9-4714-a807-2a6368b538fd","Type":"ContainerDied","Data":"b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2"} Feb 23 06:49:12 crc kubenswrapper[5047]: I0223 06:49:12.384477 5047 generic.go:334] "Generic (PLEG): container finished" podID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerID="f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78" exitCode=0 Feb 23 06:49:12 crc kubenswrapper[5047]: I0223 06:49:12.384563 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7bfp" event={"ID":"c470fb1c-868d-420e-b6e2-61369cecd6c3","Type":"ContainerDied","Data":"f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78"} Feb 23 06:49:12 crc kubenswrapper[5047]: I0223 06:49:12.389384 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sssp" event={"ID":"b848e749-e1a5-47ab-aca5-13e799a2504e","Type":"ContainerStarted","Data":"d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd"} Feb 23 06:49:13 crc kubenswrapper[5047]: I0223 06:49:13.406463 5047 generic.go:334] "Generic (PLEG): container finished" podID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerID="d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd" exitCode=0 Feb 23 06:49:13 crc kubenswrapper[5047]: I0223 06:49:13.406586 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-7sssp" event={"ID":"b848e749-e1a5-47ab-aca5-13e799a2504e","Type":"ContainerDied","Data":"d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd"} Feb 23 06:49:13 crc kubenswrapper[5047]: I0223 06:49:13.419848 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhsns" event={"ID":"83f104c4-c1f9-4714-a807-2a6368b538fd","Type":"ContainerStarted","Data":"8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da"} Feb 23 06:49:13 crc kubenswrapper[5047]: I0223 06:49:13.425273 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7bfp" event={"ID":"c470fb1c-868d-420e-b6e2-61369cecd6c3","Type":"ContainerStarted","Data":"d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671"} Feb 23 06:49:13 crc kubenswrapper[5047]: I0223 06:49:13.429141 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzzqw" event={"ID":"f8d5e28d-8de9-4514-92d8-05223c439cd4","Type":"ContainerStarted","Data":"63f7071d8efd7f08ad874d9c7c825f08bcaa17142784099977e08d0243b1554a"} Feb 23 06:49:13 crc kubenswrapper[5047]: I0223 06:49:13.493138 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zhsns" podStartSLOduration=3.735480435 podStartE2EDuration="49.493097799s" podCreationTimestamp="2026-02-23 06:48:24 +0000 UTC" firstStartedPulling="2026-02-23 06:48:27.024814024 +0000 UTC m=+229.276141158" lastFinishedPulling="2026-02-23 06:49:12.782431388 +0000 UTC m=+275.033758522" observedRunningTime="2026-02-23 06:49:13.487866104 +0000 UTC m=+275.739193258" watchObservedRunningTime="2026-02-23 06:49:13.493097799 +0000 UTC m=+275.744424933" Feb 23 06:49:13 crc kubenswrapper[5047]: I0223 06:49:13.514984 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f7bfp" 
podStartSLOduration=3.947983474 podStartE2EDuration="46.514960987s" podCreationTimestamp="2026-02-23 06:48:27 +0000 UTC" firstStartedPulling="2026-02-23 06:48:30.2408111 +0000 UTC m=+232.492138234" lastFinishedPulling="2026-02-23 06:49:12.807788623 +0000 UTC m=+275.059115747" observedRunningTime="2026-02-23 06:49:13.510318808 +0000 UTC m=+275.761645952" watchObservedRunningTime="2026-02-23 06:49:13.514960987 +0000 UTC m=+275.766288121" Feb 23 06:49:14 crc kubenswrapper[5047]: I0223 06:49:14.435700 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnn8v" event={"ID":"46419724-8d7c-47d4-9d51-3aef5c54ab1b","Type":"ContainerStarted","Data":"c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f"} Feb 23 06:49:14 crc kubenswrapper[5047]: I0223 06:49:14.438604 5047 generic.go:334] "Generic (PLEG): container finished" podID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerID="3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b" exitCode=0 Feb 23 06:49:14 crc kubenswrapper[5047]: I0223 06:49:14.438659 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6khf" event={"ID":"8964090d-1ca0-4cb9-b2a6-f4293fdf9591","Type":"ContainerDied","Data":"3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b"} Feb 23 06:49:14 crc kubenswrapper[5047]: I0223 06:49:14.440876 5047 generic.go:334] "Generic (PLEG): container finished" podID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerID="63f7071d8efd7f08ad874d9c7c825f08bcaa17142784099977e08d0243b1554a" exitCode=0 Feb 23 06:49:14 crc kubenswrapper[5047]: I0223 06:49:14.440926 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzzqw" event={"ID":"f8d5e28d-8de9-4514-92d8-05223c439cd4","Type":"ContainerDied","Data":"63f7071d8efd7f08ad874d9c7c825f08bcaa17142784099977e08d0243b1554a"} Feb 23 06:49:14 crc kubenswrapper[5047]: I0223 06:49:14.443777 5047 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sssp" event={"ID":"b848e749-e1a5-47ab-aca5-13e799a2504e","Type":"ContainerStarted","Data":"0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00"} Feb 23 06:49:14 crc kubenswrapper[5047]: I0223 06:49:14.536338 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7sssp" podStartSLOduration=2.70079638 podStartE2EDuration="49.536323074s" podCreationTimestamp="2026-02-23 06:48:25 +0000 UTC" firstStartedPulling="2026-02-23 06:48:27.040767306 +0000 UTC m=+229.292094440" lastFinishedPulling="2026-02-23 06:49:13.876294 +0000 UTC m=+276.127621134" observedRunningTime="2026-02-23 06:49:14.532412865 +0000 UTC m=+276.783739999" watchObservedRunningTime="2026-02-23 06:49:14.536323074 +0000 UTC m=+276.787650208" Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.142035 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.142502 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.451883 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzzqw" event={"ID":"f8d5e28d-8de9-4514-92d8-05223c439cd4","Type":"ContainerStarted","Data":"f22fe408cb96824f07bc9f1fb260b0f20284234ea9a74ebf15a5f38e7e683c72"} Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.471440 5047 generic.go:334] "Generic (PLEG): container finished" podID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerID="c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f" exitCode=0 Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.471540 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnn8v" 
event={"ID":"46419724-8d7c-47d4-9d51-3aef5c54ab1b","Type":"ContainerDied","Data":"c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f"} Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.476931 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzzqw" podStartSLOduration=3.882368783 podStartE2EDuration="47.476876253s" podCreationTimestamp="2026-02-23 06:48:28 +0000 UTC" firstStartedPulling="2026-02-23 06:48:31.272607187 +0000 UTC m=+233.523934321" lastFinishedPulling="2026-02-23 06:49:14.867114657 +0000 UTC m=+277.118441791" observedRunningTime="2026-02-23 06:49:15.473797948 +0000 UTC m=+277.725125072" watchObservedRunningTime="2026-02-23 06:49:15.476876253 +0000 UTC m=+277.728203397" Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.484790 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6khf" event={"ID":"8964090d-1ca0-4cb9-b2a6-f4293fdf9591","Type":"ContainerStarted","Data":"18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863"} Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.512772 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.531943 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g6khf" podStartSLOduration=3.861747243 podStartE2EDuration="49.531899413s" podCreationTimestamp="2026-02-23 06:48:26 +0000 UTC" firstStartedPulling="2026-02-23 06:48:29.230463728 +0000 UTC m=+231.481790852" lastFinishedPulling="2026-02-23 06:49:14.900615888 +0000 UTC m=+277.151943022" observedRunningTime="2026-02-23 06:49:15.530527974 +0000 UTC m=+277.781855108" watchObservedRunningTime="2026-02-23 06:49:15.531899413 +0000 UTC m=+277.783226547" Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.683691 5047 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:49:15 crc kubenswrapper[5047]: I0223 06:49:15.683754 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.190316 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zhsns" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="registry-server" probeResult="failure" output=< Feb 23 06:49:16 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 06:49:16 crc kubenswrapper[5047]: > Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.493935 5047 generic.go:334] "Generic (PLEG): container finished" podID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerID="3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f" exitCode=0 Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.494030 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwplb" event={"ID":"2fb237ea-bf04-4d9d-bbb0-2363c97e139b","Type":"ContainerDied","Data":"3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f"} Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.497440 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnn8v" event={"ID":"46419724-8d7c-47d4-9d51-3aef5c54ab1b","Type":"ContainerStarted","Data":"e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d"} Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.537710 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gnn8v" podStartSLOduration=3.6032266330000002 podStartE2EDuration="52.537685556s" podCreationTimestamp="2026-02-23 06:48:24 +0000 UTC" firstStartedPulling="2026-02-23 06:48:26.939936809 
+0000 UTC m=+229.191263953" lastFinishedPulling="2026-02-23 06:49:15.874395742 +0000 UTC m=+278.125722876" observedRunningTime="2026-02-23 06:49:16.536806592 +0000 UTC m=+278.788133736" watchObservedRunningTime="2026-02-23 06:49:16.537685556 +0000 UTC m=+278.789012690" Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.730157 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7sssp" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="registry-server" probeResult="failure" output=< Feb 23 06:49:16 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 06:49:16 crc kubenswrapper[5047]: > Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.760002 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.760078 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.760136 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.760798 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:49:16 crc kubenswrapper[5047]: I0223 06:49:16.760861 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee" gracePeriod=600 Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.266564 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.266939 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.505558 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee" exitCode=0 Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.505666 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee"} Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.505764 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"5e9070ba26d360a04508200057f939930be7816f548d48466f2bbab9cbbc2f3d"} Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.508704 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwplb" 
event={"ID":"2fb237ea-bf04-4d9d-bbb0-2363c97e139b","Type":"ContainerStarted","Data":"f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec"} Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.547618 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nwplb" podStartSLOduration=2.871826505 podStartE2EDuration="50.547590464s" podCreationTimestamp="2026-02-23 06:48:27 +0000 UTC" firstStartedPulling="2026-02-23 06:48:29.216067559 +0000 UTC m=+231.467394683" lastFinishedPulling="2026-02-23 06:49:16.891831508 +0000 UTC m=+279.143158642" observedRunningTime="2026-02-23 06:49:17.546413111 +0000 UTC m=+279.797740245" watchObservedRunningTime="2026-02-23 06:49:17.547590464 +0000 UTC m=+279.798917598" Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.664607 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:49:17 crc kubenswrapper[5047]: I0223 06:49:17.665103 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:49:18 crc kubenswrapper[5047]: I0223 06:49:18.274244 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:49:18 crc kubenswrapper[5047]: I0223 06:49:18.274314 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:49:18 crc kubenswrapper[5047]: I0223 06:49:18.317773 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-g6khf" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="registry-server" probeResult="failure" output=< Feb 23 06:49:18 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 06:49:18 crc kubenswrapper[5047]: > Feb 23 06:49:18 crc kubenswrapper[5047]: I0223 
06:49:18.711405 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nwplb" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="registry-server" probeResult="failure" output=< Feb 23 06:49:18 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 06:49:18 crc kubenswrapper[5047]: > Feb 23 06:49:18 crc kubenswrapper[5047]: I0223 06:49:18.728827 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzzqw" Feb 23 06:49:18 crc kubenswrapper[5047]: I0223 06:49:18.728959 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzzqw" Feb 23 06:49:19 crc kubenswrapper[5047]: I0223 06:49:19.310313 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7bfp" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="registry-server" probeResult="failure" output=< Feb 23 06:49:19 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 06:49:19 crc kubenswrapper[5047]: > Feb 23 06:49:19 crc kubenswrapper[5047]: I0223 06:49:19.777279 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzzqw" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="registry-server" probeResult="failure" output=< Feb 23 06:49:19 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 06:49:19 crc kubenswrapper[5047]: > Feb 23 06:49:19 crc kubenswrapper[5047]: I0223 06:49:19.794127 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2qnnx"] Feb 23 06:49:19 crc kubenswrapper[5047]: I0223 06:49:19.794598 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2qnnx" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" 
containerName="registry-server" containerID="cri-o://3c7d1c5a6e3cdc3bc7c29ef80833f29f0c4972c4aa92a2c21c4fcd43fb29ea45" gracePeriod=2 Feb 23 06:49:20 crc kubenswrapper[5047]: I0223 06:49:20.526918 5047 generic.go:334] "Generic (PLEG): container finished" podID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerID="3c7d1c5a6e3cdc3bc7c29ef80833f29f0c4972c4aa92a2c21c4fcd43fb29ea45" exitCode=0 Feb 23 06:49:20 crc kubenswrapper[5047]: I0223 06:49:20.527129 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qnnx" event={"ID":"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0","Type":"ContainerDied","Data":"3c7d1c5a6e3cdc3bc7c29ef80833f29f0c4972c4aa92a2c21c4fcd43fb29ea45"} Feb 23 06:49:20 crc kubenswrapper[5047]: I0223 06:49:20.861622 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:49:20 crc kubenswrapper[5047]: I0223 06:49:20.941089 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-catalog-content\") pod \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " Feb 23 06:49:20 crc kubenswrapper[5047]: I0223 06:49:20.941213 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-utilities\") pod \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " Feb 23 06:49:20 crc kubenswrapper[5047]: I0223 06:49:20.941304 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnfnt\" (UniqueName: \"kubernetes.io/projected/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-kube-api-access-mnfnt\") pod \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\" (UID: \"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0\") " Feb 23 06:49:20 crc 
kubenswrapper[5047]: I0223 06:49:20.942144 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-utilities" (OuterVolumeSpecName: "utilities") pod "8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" (UID: "8ea6f7a1-a966-453a-85a0-fafb0e87bcf0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:20 crc kubenswrapper[5047]: I0223 06:49:20.949973 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-kube-api-access-mnfnt" (OuterVolumeSpecName: "kube-api-access-mnfnt") pod "8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" (UID: "8ea6f7a1-a966-453a-85a0-fafb0e87bcf0"). InnerVolumeSpecName "kube-api-access-mnfnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.009204 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" (UID: "8ea6f7a1-a966-453a-85a0-fafb0e87bcf0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.042759 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnfnt\" (UniqueName: \"kubernetes.io/projected/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-kube-api-access-mnfnt\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.042809 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.042819 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.538072 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qnnx" event={"ID":"8ea6f7a1-a966-453a-85a0-fafb0e87bcf0","Type":"ContainerDied","Data":"23e82e80b0cd4427f7c211f9795e3ae85c2ef77debaae4341d812cbb9a3906a8"} Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.538249 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2qnnx" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.538544 5047 scope.go:117] "RemoveContainer" containerID="3c7d1c5a6e3cdc3bc7c29ef80833f29f0c4972c4aa92a2c21c4fcd43fb29ea45" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.572734 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2qnnx"] Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.576471 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2qnnx"] Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.576682 5047 scope.go:117] "RemoveContainer" containerID="c7fd5be5c06c3e053f9fdb36a9276849faad45b973d7f3265fd133bd30663289" Feb 23 06:49:21 crc kubenswrapper[5047]: I0223 06:49:21.602950 5047 scope.go:117] "RemoveContainer" containerID="09dfcb4a4d2d7764b5e6c3068780a41cda7b7a4231fe0d3eff02f2fbd4d28f30" Feb 23 06:49:22 crc kubenswrapper[5047]: I0223 06:49:22.352984 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" path="/var/lib/kubelet/pods/8ea6f7a1-a966-453a-85a0-fafb0e87bcf0/volumes" Feb 23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.089456 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-694f4c68df-rbr22"] Feb 23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.089778 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" podUID="81624f2b-37c1-42b3-863c-2f4a61bb68cc" containerName="controller-manager" containerID="cri-o://808aac692cd147797b9a4330c3137dffddf573443fc6fbe488ecb07b73570f1a" gracePeriod=30 Feb 23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.182508 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"] Feb 
23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.183187 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" podUID="7fd9d004-a17c-4302-b139-b7fe123a5896" containerName="route-controller-manager" containerID="cri-o://0b9c00f2a8142c693529ca056eb5c0c25995131f6b2d6ea7cefac9296a3c3f58" gracePeriod=30 Feb 23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.564234 5047 generic.go:334] "Generic (PLEG): container finished" podID="81624f2b-37c1-42b3-863c-2f4a61bb68cc" containerID="808aac692cd147797b9a4330c3137dffddf573443fc6fbe488ecb07b73570f1a" exitCode=0 Feb 23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.564399 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" event={"ID":"81624f2b-37c1-42b3-863c-2f4a61bb68cc","Type":"ContainerDied","Data":"808aac692cd147797b9a4330c3137dffddf573443fc6fbe488ecb07b73570f1a"} Feb 23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.567362 5047 generic.go:334] "Generic (PLEG): container finished" podID="7fd9d004-a17c-4302-b139-b7fe123a5896" containerID="0b9c00f2a8142c693529ca056eb5c0c25995131f6b2d6ea7cefac9296a3c3f58" exitCode=0 Feb 23 06:49:23 crc kubenswrapper[5047]: I0223 06:49:23.567416 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" event={"ID":"7fd9d004-a17c-4302-b139-b7fe123a5896","Type":"ContainerDied","Data":"0b9c00f2a8142c693529ca056eb5c0c25995131f6b2d6ea7cefac9296a3c3f58"} Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.085371 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.099665 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194144 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-config\") pod \"7fd9d004-a17c-4302-b139-b7fe123a5896\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194194 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-client-ca\") pod \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194224 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-client-ca\") pod \"7fd9d004-a17c-4302-b139-b7fe123a5896\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194251 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81624f2b-37c1-42b3-863c-2f4a61bb68cc-serving-cert\") pod \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194303 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd9hd\" (UniqueName: \"kubernetes.io/projected/81624f2b-37c1-42b3-863c-2f4a61bb68cc-kube-api-access-pd9hd\") pod \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194331 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pvmph\" (UniqueName: \"kubernetes.io/projected/7fd9d004-a17c-4302-b139-b7fe123a5896-kube-api-access-pvmph\") pod \"7fd9d004-a17c-4302-b139-b7fe123a5896\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194351 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-proxy-ca-bundles\") pod \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194383 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9d004-a17c-4302-b139-b7fe123a5896-serving-cert\") pod \"7fd9d004-a17c-4302-b139-b7fe123a5896\" (UID: \"7fd9d004-a17c-4302-b139-b7fe123a5896\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.194407 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-config\") pod \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\" (UID: \"81624f2b-37c1-42b3-863c-2f4a61bb68cc\") " Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.195640 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-config" (OuterVolumeSpecName: "config") pod "81624f2b-37c1-42b3-863c-2f4a61bb68cc" (UID: "81624f2b-37c1-42b3-863c-2f4a61bb68cc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.196167 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-config" (OuterVolumeSpecName: "config") pod "7fd9d004-a17c-4302-b139-b7fe123a5896" (UID: "7fd9d004-a17c-4302-b139-b7fe123a5896"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.196408 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "81624f2b-37c1-42b3-863c-2f4a61bb68cc" (UID: "81624f2b-37c1-42b3-863c-2f4a61bb68cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.196630 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-client-ca" (OuterVolumeSpecName: "client-ca") pod "7fd9d004-a17c-4302-b139-b7fe123a5896" (UID: "7fd9d004-a17c-4302-b139-b7fe123a5896"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.198254 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "81624f2b-37c1-42b3-863c-2f4a61bb68cc" (UID: "81624f2b-37c1-42b3-863c-2f4a61bb68cc"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.203892 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81624f2b-37c1-42b3-863c-2f4a61bb68cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "81624f2b-37c1-42b3-863c-2f4a61bb68cc" (UID: "81624f2b-37c1-42b3-863c-2f4a61bb68cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.203985 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd9d004-a17c-4302-b139-b7fe123a5896-kube-api-access-pvmph" (OuterVolumeSpecName: "kube-api-access-pvmph") pod "7fd9d004-a17c-4302-b139-b7fe123a5896" (UID: "7fd9d004-a17c-4302-b139-b7fe123a5896"). InnerVolumeSpecName "kube-api-access-pvmph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.212061 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd9d004-a17c-4302-b139-b7fe123a5896-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7fd9d004-a17c-4302-b139-b7fe123a5896" (UID: "7fd9d004-a17c-4302-b139-b7fe123a5896"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.218707 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81624f2b-37c1-42b3-863c-2f4a61bb68cc-kube-api-access-pd9hd" (OuterVolumeSpecName: "kube-api-access-pd9hd") pod "81624f2b-37c1-42b3-863c-2f4a61bb68cc" (UID: "81624f2b-37c1-42b3-863c-2f4a61bb68cc"). InnerVolumeSpecName "kube-api-access-pd9hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295733 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd9hd\" (UniqueName: \"kubernetes.io/projected/81624f2b-37c1-42b3-863c-2f4a61bb68cc-kube-api-access-pd9hd\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295777 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvmph\" (UniqueName: \"kubernetes.io/projected/7fd9d004-a17c-4302-b139-b7fe123a5896-kube-api-access-pvmph\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295788 5047 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295797 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fd9d004-a17c-4302-b139-b7fe123a5896-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295810 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295823 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295831 5047 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81624f2b-37c1-42b3-863c-2f4a61bb68cc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295839 5047 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7fd9d004-a17c-4302-b139-b7fe123a5896-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.295850 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81624f2b-37c1-42b3-863c-2f4a61bb68cc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.579565 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" event={"ID":"81624f2b-37c1-42b3-863c-2f4a61bb68cc","Type":"ContainerDied","Data":"00470f164bc9557fb541ab3ddad5f4c7098587009a313398e2471f8109c45023"} Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.579691 5047 scope.go:117] "RemoveContainer" containerID="808aac692cd147797b9a4330c3137dffddf573443fc6fbe488ecb07b73570f1a" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.579896 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-694f4c68df-rbr22" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.585162 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" event={"ID":"7fd9d004-a17c-4302-b139-b7fe123a5896","Type":"ContainerDied","Data":"de552b40290aa71ce85734b8cbaaba5ac0a09b00b13dadcdc8a334b8905f3aab"} Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.585615 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.617288 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-694f4c68df-rbr22"] Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.621518 5047 scope.go:117] "RemoveContainer" containerID="0b9c00f2a8142c693529ca056eb5c0c25995131f6b2d6ea7cefac9296a3c3f58" Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.633501 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-694f4c68df-rbr22"] Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.640680 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"] Feb 23 06:49:24 crc kubenswrapper[5047]: I0223 06:49:24.646214 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677bd9c759-kqbn8"] Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.130450 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd"] Feb 23 06:49:25 crc kubenswrapper[5047]: E0223 06:49:25.130738 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerName="extract-utilities" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.130755 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerName="extract-utilities" Feb 23 06:49:25 crc kubenswrapper[5047]: E0223 06:49:25.130770 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerName="extract-content" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.130779 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerName="extract-content" Feb 23 06:49:25 crc kubenswrapper[5047]: E0223 06:49:25.130796 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerName="registry-server" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.130805 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerName="registry-server" Feb 23 06:49:25 crc kubenswrapper[5047]: E0223 06:49:25.130820 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81624f2b-37c1-42b3-863c-2f4a61bb68cc" containerName="controller-manager" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.130828 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="81624f2b-37c1-42b3-863c-2f4a61bb68cc" containerName="controller-manager" Feb 23 06:49:25 crc kubenswrapper[5047]: E0223 06:49:25.130843 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd9d004-a17c-4302-b139-b7fe123a5896" containerName="route-controller-manager" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.130852 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd9d004-a17c-4302-b139-b7fe123a5896" containerName="route-controller-manager" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.130996 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea6f7a1-a966-453a-85a0-fafb0e87bcf0" containerName="registry-server" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.131016 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd9d004-a17c-4302-b139-b7fe123a5896" containerName="route-controller-manager" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.131028 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="81624f2b-37c1-42b3-863c-2f4a61bb68cc" containerName="controller-manager" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.131516 5047 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.134465 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.134859 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.134947 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.136546 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.138652 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.141495 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq"] Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.142759 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.145324 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.147899 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.155831 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.155851 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.155870 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.155944 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.156019 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.156067 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.166699 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd"] Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.172748 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq"] Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.205832 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.208991 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68a2d8c8-392c-4f88-b685-2827c52e8de5-serving-cert\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209102 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-config\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209173 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-client-ca\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209231 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-proxy-ca-bundles\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " 
pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209273 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-client-ca\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209320 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pl7m\" (UniqueName: \"kubernetes.io/projected/68a2d8c8-392c-4f88-b685-2827c52e8de5-kube-api-access-6pl7m\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209378 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ff94ab-7f35-4a62-8217-568c5f36047a-serving-cert\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209425 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-config\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.209464 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfrg\" (UniqueName: \"kubernetes.io/projected/44ff94ab-7f35-4a62-8217-568c5f36047a-kube-api-access-sbfrg\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.246888 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.249031 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.249073 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.297075 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.310753 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-client-ca\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.310810 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pl7m\" (UniqueName: \"kubernetes.io/projected/68a2d8c8-392c-4f88-b685-2827c52e8de5-kube-api-access-6pl7m\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.310846 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ff94ab-7f35-4a62-8217-568c5f36047a-serving-cert\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.310876 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-config\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.310894 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfrg\" (UniqueName: \"kubernetes.io/projected/44ff94ab-7f35-4a62-8217-568c5f36047a-kube-api-access-sbfrg\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.310918 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68a2d8c8-392c-4f88-b685-2827c52e8de5-serving-cert\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.310990 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-config\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.311009 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-client-ca\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.311035 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-proxy-ca-bundles\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.311851 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-client-ca\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.312592 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-client-ca\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.312947 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-config\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.312954 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-config\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.313330 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-proxy-ca-bundles\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.325175 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ff94ab-7f35-4a62-8217-568c5f36047a-serving-cert\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.325660 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68a2d8c8-392c-4f88-b685-2827c52e8de5-serving-cert\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc 
kubenswrapper[5047]: I0223 06:49:25.338694 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfrg\" (UniqueName: \"kubernetes.io/projected/44ff94ab-7f35-4a62-8217-568c5f36047a-kube-api-access-sbfrg\") pod \"controller-manager-546ffcb6cc-2xhxd\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.342133 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pl7m\" (UniqueName: \"kubernetes.io/projected/68a2d8c8-392c-4f88-b685-2827c52e8de5-kube-api-access-6pl7m\") pod \"route-controller-manager-7f89c65d5c-k8ffq\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.456553 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.470035 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.662061 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.665543 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq"] Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.751818 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.752512 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd"] Feb 23 06:49:25 crc kubenswrapper[5047]: W0223 06:49:25.760940 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ff94ab_7f35_4a62_8217_568c5f36047a.slice/crio-56f74e03de4d09be1c22b039bc967be86d3765d6e379c4cc428041186992c1d9 WatchSource:0}: Error finding container 56f74e03de4d09be1c22b039bc967be86d3765d6e379c4cc428041186992c1d9: Status 404 returned error can't find the container with id 56f74e03de4d09be1c22b039bc967be86d3765d6e379c4cc428041186992c1d9 Feb 23 06:49:25 crc kubenswrapper[5047]: I0223 06:49:25.813622 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.350939 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd9d004-a17c-4302-b139-b7fe123a5896" path="/var/lib/kubelet/pods/7fd9d004-a17c-4302-b139-b7fe123a5896/volumes" Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.352242 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="81624f2b-37c1-42b3-863c-2f4a61bb68cc" path="/var/lib/kubelet/pods/81624f2b-37c1-42b3-863c-2f4a61bb68cc/volumes" Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.605126 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" event={"ID":"44ff94ab-7f35-4a62-8217-568c5f36047a","Type":"ContainerStarted","Data":"c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4"} Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.612119 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" event={"ID":"44ff94ab-7f35-4a62-8217-568c5f36047a","Type":"ContainerStarted","Data":"56f74e03de4d09be1c22b039bc967be86d3765d6e379c4cc428041186992c1d9"} Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.615789 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.616319 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" event={"ID":"68a2d8c8-392c-4f88-b685-2827c52e8de5","Type":"ContainerStarted","Data":"4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62"} Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.616389 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" event={"ID":"68a2d8c8-392c-4f88-b685-2827c52e8de5","Type":"ContainerStarted","Data":"7509d6d82c601117c78d562a684d9b8474e0a234fe4eaa819a533f98193bbae8"} Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.626567 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:26 crc kubenswrapper[5047]: I0223 06:49:26.652469 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" podStartSLOduration=3.652448491 podStartE2EDuration="3.652448491s" podCreationTimestamp="2026-02-23 06:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:49:26.648290055 +0000 UTC m=+288.899617199" watchObservedRunningTime="2026-02-23 06:49:26.652448491 +0000 UTC m=+288.903775615" Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.313740 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.352632 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" podStartSLOduration=4.35260009 podStartE2EDuration="4.35260009s" podCreationTimestamp="2026-02-23 06:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:49:26.701814852 +0000 UTC m=+288.953142006" watchObservedRunningTime="2026-02-23 06:49:27.35260009 +0000 UTC m=+289.603927234" Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.363790 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.594187 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7sssp"] Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.624135 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7sssp" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="registry-server" 
containerID="cri-o://0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00" gracePeriod=2 Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.624715 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.640313 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.765468 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:49:27 crc kubenswrapper[5047]: I0223 06:49:27.824964 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.032289 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.058485 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-utilities\") pod \"b848e749-e1a5-47ab-aca5-13e799a2504e\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.058588 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c48zt\" (UniqueName: \"kubernetes.io/projected/b848e749-e1a5-47ab-aca5-13e799a2504e-kube-api-access-c48zt\") pod \"b848e749-e1a5-47ab-aca5-13e799a2504e\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.058669 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-catalog-content\") pod \"b848e749-e1a5-47ab-aca5-13e799a2504e\" (UID: \"b848e749-e1a5-47ab-aca5-13e799a2504e\") " Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.062824 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-utilities" (OuterVolumeSpecName: "utilities") pod "b848e749-e1a5-47ab-aca5-13e799a2504e" (UID: "b848e749-e1a5-47ab-aca5-13e799a2504e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.073393 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b848e749-e1a5-47ab-aca5-13e799a2504e-kube-api-access-c48zt" (OuterVolumeSpecName: "kube-api-access-c48zt") pod "b848e749-e1a5-47ab-aca5-13e799a2504e" (UID: "b848e749-e1a5-47ab-aca5-13e799a2504e"). InnerVolumeSpecName "kube-api-access-c48zt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.126139 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b848e749-e1a5-47ab-aca5-13e799a2504e" (UID: "b848e749-e1a5-47ab-aca5-13e799a2504e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.160741 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.160789 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b848e749-e1a5-47ab-aca5-13e799a2504e-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.160820 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c48zt\" (UniqueName: \"kubernetes.io/projected/b848e749-e1a5-47ab-aca5-13e799a2504e-kube-api-access-c48zt\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.325433 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.370100 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.632385 5047 generic.go:334] "Generic (PLEG): container finished" podID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerID="0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00" exitCode=0 Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 
06:49:28.632568 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7sssp" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.632664 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sssp" event={"ID":"b848e749-e1a5-47ab-aca5-13e799a2504e","Type":"ContainerDied","Data":"0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00"} Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.632732 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sssp" event={"ID":"b848e749-e1a5-47ab-aca5-13e799a2504e","Type":"ContainerDied","Data":"d8b07b669d4ab3ac572382b802ccb9a00656830b6d87ea091e45b2086c3fd2df"} Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.632763 5047 scope.go:117] "RemoveContainer" containerID="0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.651546 5047 scope.go:117] "RemoveContainer" containerID="d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.666419 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7sssp"] Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.670907 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7sssp"] Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.683529 5047 scope.go:117] "RemoveContainer" containerID="504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.710445 5047 scope.go:117] "RemoveContainer" containerID="0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00" Feb 23 06:49:28 crc kubenswrapper[5047]: E0223 06:49:28.721185 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00\": container with ID starting with 0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00 not found: ID does not exist" containerID="0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.721267 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00"} err="failed to get container status \"0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00\": rpc error: code = NotFound desc = could not find container \"0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00\": container with ID starting with 0566b4f3d56123c0d4788ef424f01f3e28a86d43dd4615c5708c044e540aff00 not found: ID does not exist" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.721317 5047 scope.go:117] "RemoveContainer" containerID="d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd" Feb 23 06:49:28 crc kubenswrapper[5047]: E0223 06:49:28.722164 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd\": container with ID starting with d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd not found: ID does not exist" containerID="d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.722230 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd"} err="failed to get container status \"d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd\": rpc error: code = NotFound desc = could not find container 
\"d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd\": container with ID starting with d0977f48f0619763fb255dba7010776858cf611951e3291510196e4d6c8d9cbd not found: ID does not exist" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.722277 5047 scope.go:117] "RemoveContainer" containerID="504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472" Feb 23 06:49:28 crc kubenswrapper[5047]: E0223 06:49:28.724015 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472\": container with ID starting with 504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472 not found: ID does not exist" containerID="504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.724062 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472"} err="failed to get container status \"504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472\": rpc error: code = NotFound desc = could not find container \"504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472\": container with ID starting with 504c07b2ba8502e5cb1d852c7b7ca9904e1651d50fda26ef38cb0fda428c1472 not found: ID does not exist" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.781135 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzzqw" Feb 23 06:49:28 crc kubenswrapper[5047]: I0223 06:49:28.827109 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzzqw" Feb 23 06:49:29 crc kubenswrapper[5047]: I0223 06:49:29.991326 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwplb"] Feb 23 06:49:29 
crc kubenswrapper[5047]: I0223 06:49:29.991951 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nwplb" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="registry-server" containerID="cri-o://f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec" gracePeriod=2 Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.358996 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" path="/var/lib/kubelet/pods/b848e749-e1a5-47ab-aca5-13e799a2504e/volumes" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.553859 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.593077 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-catalog-content\") pod \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.593260 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-utilities\") pod \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.593301 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn9gt\" (UniqueName: \"kubernetes.io/projected/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-kube-api-access-pn9gt\") pod \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\" (UID: \"2fb237ea-bf04-4d9d-bbb0-2363c97e139b\") " Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.594434 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-utilities" (OuterVolumeSpecName: "utilities") pod "2fb237ea-bf04-4d9d-bbb0-2363c97e139b" (UID: "2fb237ea-bf04-4d9d-bbb0-2363c97e139b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.602192 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-kube-api-access-pn9gt" (OuterVolumeSpecName: "kube-api-access-pn9gt") pod "2fb237ea-bf04-4d9d-bbb0-2363c97e139b" (UID: "2fb237ea-bf04-4d9d-bbb0-2363c97e139b"). InnerVolumeSpecName "kube-api-access-pn9gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.618871 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fb237ea-bf04-4d9d-bbb0-2363c97e139b" (UID: "2fb237ea-bf04-4d9d-bbb0-2363c97e139b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.651191 5047 generic.go:334] "Generic (PLEG): container finished" podID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerID="f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec" exitCode=0 Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.651267 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwplb" event={"ID":"2fb237ea-bf04-4d9d-bbb0-2363c97e139b","Type":"ContainerDied","Data":"f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec"} Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.651376 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwplb" event={"ID":"2fb237ea-bf04-4d9d-bbb0-2363c97e139b","Type":"ContainerDied","Data":"9d8aeaa52ab88af05d4b401edb4392c9626f20da445782a3132827ac7572b45e"} Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.651374 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwplb" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.651459 5047 scope.go:117] "RemoveContainer" containerID="f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.673481 5047 scope.go:117] "RemoveContainer" containerID="3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.705633 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.705681 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn9gt\" (UniqueName: \"kubernetes.io/projected/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-kube-api-access-pn9gt\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.705699 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb237ea-bf04-4d9d-bbb0-2363c97e139b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.707107 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwplb"] Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.715903 5047 scope.go:117] "RemoveContainer" containerID="85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.715924 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwplb"] Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.748795 5047 scope.go:117] "RemoveContainer" containerID="f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec" Feb 23 06:49:30 crc kubenswrapper[5047]: E0223 
06:49:30.751765 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec\": container with ID starting with f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec not found: ID does not exist" containerID="f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.751834 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec"} err="failed to get container status \"f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec\": rpc error: code = NotFound desc = could not find container \"f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec\": container with ID starting with f9465befb93c0faf70b2e4a6794350fdd63bb27c2c80100e94c09ab7729204ec not found: ID does not exist" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.751877 5047 scope.go:117] "RemoveContainer" containerID="3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f" Feb 23 06:49:30 crc kubenswrapper[5047]: E0223 06:49:30.752499 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f\": container with ID starting with 3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f not found: ID does not exist" containerID="3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.752550 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f"} err="failed to get container status \"3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f\": rpc 
error: code = NotFound desc = could not find container \"3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f\": container with ID starting with 3d7c2f860b5bcbdc1a58fbaae84722fb5b6bba084a2a1746b33a0f6a9d64770f not found: ID does not exist" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.752590 5047 scope.go:117] "RemoveContainer" containerID="85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8" Feb 23 06:49:30 crc kubenswrapper[5047]: E0223 06:49:30.752974 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8\": container with ID starting with 85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8 not found: ID does not exist" containerID="85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8" Feb 23 06:49:30 crc kubenswrapper[5047]: I0223 06:49:30.753000 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8"} err="failed to get container status \"85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8\": rpc error: code = NotFound desc = could not find container \"85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8\": container with ID starting with 85ea9bf5866baffb3de79fee60dc6badac55ae38008041239c82df1a477280d8 not found: ID does not exist" Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.190166 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzzqw"] Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.190475 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzzqw" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="registry-server" 
containerID="cri-o://f22fe408cb96824f07bc9f1fb260b0f20284234ea9a74ebf15a5f38e7e683c72" gracePeriod=2 Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.356725 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" path="/var/lib/kubelet/pods/2fb237ea-bf04-4d9d-bbb0-2363c97e139b/volumes" Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.669297 5047 generic.go:334] "Generic (PLEG): container finished" podID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerID="f22fe408cb96824f07bc9f1fb260b0f20284234ea9a74ebf15a5f38e7e683c72" exitCode=0 Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.669386 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzzqw" event={"ID":"f8d5e28d-8de9-4514-92d8-05223c439cd4","Type":"ContainerDied","Data":"f22fe408cb96824f07bc9f1fb260b0f20284234ea9a74ebf15a5f38e7e683c72"} Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.748525 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzzqw" Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.841972 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-utilities\") pod \"f8d5e28d-8de9-4514-92d8-05223c439cd4\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.842130 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-catalog-content\") pod \"f8d5e28d-8de9-4514-92d8-05223c439cd4\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.842220 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skmdz\" (UniqueName: \"kubernetes.io/projected/f8d5e28d-8de9-4514-92d8-05223c439cd4-kube-api-access-skmdz\") pod \"f8d5e28d-8de9-4514-92d8-05223c439cd4\" (UID: \"f8d5e28d-8de9-4514-92d8-05223c439cd4\") " Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.842932 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-utilities" (OuterVolumeSpecName: "utilities") pod "f8d5e28d-8de9-4514-92d8-05223c439cd4" (UID: "f8d5e28d-8de9-4514-92d8-05223c439cd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.847636 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d5e28d-8de9-4514-92d8-05223c439cd4-kube-api-access-skmdz" (OuterVolumeSpecName: "kube-api-access-skmdz") pod "f8d5e28d-8de9-4514-92d8-05223c439cd4" (UID: "f8d5e28d-8de9-4514-92d8-05223c439cd4"). InnerVolumeSpecName "kube-api-access-skmdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.944762 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skmdz\" (UniqueName: \"kubernetes.io/projected/f8d5e28d-8de9-4514-92d8-05223c439cd4-kube-api-access-skmdz\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.944856 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:32 crc kubenswrapper[5047]: I0223 06:49:32.972602 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8d5e28d-8de9-4514-92d8-05223c439cd4" (UID: "f8d5e28d-8de9-4514-92d8-05223c439cd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.046314 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8d5e28d-8de9-4514-92d8-05223c439cd4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.681416 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzzqw" event={"ID":"f8d5e28d-8de9-4514-92d8-05223c439cd4","Type":"ContainerDied","Data":"783f0f4f191e815a95308c08b1128190bd83b481cc4d56447ee3beb71cac639c"} Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.681512 5047 scope.go:117] "RemoveContainer" containerID="f22fe408cb96824f07bc9f1fb260b0f20284234ea9a74ebf15a5f38e7e683c72" Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.681596 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzzqw" Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.703446 5047 scope.go:117] "RemoveContainer" containerID="63f7071d8efd7f08ad874d9c7c825f08bcaa17142784099977e08d0243b1554a" Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.729779 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzzqw"] Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.732879 5047 scope.go:117] "RemoveContainer" containerID="11d43239cc1e1694ddfad025e3fa1e2d647ff401c247e3a65dcae4c19124eea7" Feb 23 06:49:33 crc kubenswrapper[5047]: I0223 06:49:33.735507 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzzqw"] Feb 23 06:49:34 crc kubenswrapper[5047]: I0223 06:49:34.353379 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" path="/var/lib/kubelet/pods/f8d5e28d-8de9-4514-92d8-05223c439cd4/volumes" Feb 23 06:49:37 crc kubenswrapper[5047]: I0223 06:49:37.212151 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ttb4m"] Feb 23 06:49:38 crc kubenswrapper[5047]: I0223 06:49:38.169788 5047 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.086764 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd"] Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.087102 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" podUID="44ff94ab-7f35-4a62-8217-568c5f36047a" containerName="controller-manager" containerID="cri-o://c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4" gracePeriod=30 Feb 23 06:49:43 crc 
kubenswrapper[5047]: I0223 06:49:43.115663 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq"] Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.116507 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" podUID="68a2d8c8-392c-4f88-b685-2827c52e8de5" containerName="route-controller-manager" containerID="cri-o://4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62" gracePeriod=30 Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.624621 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.695236 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.711711 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-client-ca\") pod \"68a2d8c8-392c-4f88-b685-2827c52e8de5\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.711770 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68a2d8c8-392c-4f88-b685-2827c52e8de5-serving-cert\") pod \"68a2d8c8-392c-4f88-b685-2827c52e8de5\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.711849 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-config\") pod 
\"68a2d8c8-392c-4f88-b685-2827c52e8de5\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.711942 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pl7m\" (UniqueName: \"kubernetes.io/projected/68a2d8c8-392c-4f88-b685-2827c52e8de5-kube-api-access-6pl7m\") pod \"68a2d8c8-392c-4f88-b685-2827c52e8de5\" (UID: \"68a2d8c8-392c-4f88-b685-2827c52e8de5\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.714356 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-client-ca" (OuterVolumeSpecName: "client-ca") pod "68a2d8c8-392c-4f88-b685-2827c52e8de5" (UID: "68a2d8c8-392c-4f88-b685-2827c52e8de5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.714442 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-config" (OuterVolumeSpecName: "config") pod "68a2d8c8-392c-4f88-b685-2827c52e8de5" (UID: "68a2d8c8-392c-4f88-b685-2827c52e8de5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.722966 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a2d8c8-392c-4f88-b685-2827c52e8de5-kube-api-access-6pl7m" (OuterVolumeSpecName: "kube-api-access-6pl7m") pod "68a2d8c8-392c-4f88-b685-2827c52e8de5" (UID: "68a2d8c8-392c-4f88-b685-2827c52e8de5"). InnerVolumeSpecName "kube-api-access-6pl7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.724630 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a2d8c8-392c-4f88-b685-2827c52e8de5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68a2d8c8-392c-4f88-b685-2827c52e8de5" (UID: "68a2d8c8-392c-4f88-b685-2827c52e8de5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.775850 5047 generic.go:334] "Generic (PLEG): container finished" podID="68a2d8c8-392c-4f88-b685-2827c52e8de5" containerID="4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62" exitCode=0 Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.775949 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" event={"ID":"68a2d8c8-392c-4f88-b685-2827c52e8de5","Type":"ContainerDied","Data":"4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62"} Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.775986 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" event={"ID":"68a2d8c8-392c-4f88-b685-2827c52e8de5","Type":"ContainerDied","Data":"7509d6d82c601117c78d562a684d9b8474e0a234fe4eaa819a533f98193bbae8"} Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.776007 5047 scope.go:117] "RemoveContainer" containerID="4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.776139 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.788539 5047 generic.go:334] "Generic (PLEG): container finished" podID="44ff94ab-7f35-4a62-8217-568c5f36047a" containerID="c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4" exitCode=0 Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.788603 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" event={"ID":"44ff94ab-7f35-4a62-8217-568c5f36047a","Type":"ContainerDied","Data":"c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4"} Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.788638 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" event={"ID":"44ff94ab-7f35-4a62-8217-568c5f36047a","Type":"ContainerDied","Data":"56f74e03de4d09be1c22b039bc967be86d3765d6e379c4cc428041186992c1d9"} Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.788691 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.805199 5047 scope.go:117] "RemoveContainer" containerID="4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62" Feb 23 06:49:43 crc kubenswrapper[5047]: E0223 06:49:43.806066 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62\": container with ID starting with 4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62 not found: ID does not exist" containerID="4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.806121 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62"} err="failed to get container status \"4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62\": rpc error: code = NotFound desc = could not find container \"4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62\": container with ID starting with 4e16148110fcfba6b8fd7315143f5f4ab3553f46e5e25ee1c82823808fb38b62 not found: ID does not exist" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.806161 5047 scope.go:117] "RemoveContainer" containerID="c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813354 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbfrg\" (UniqueName: \"kubernetes.io/projected/44ff94ab-7f35-4a62-8217-568c5f36047a-kube-api-access-sbfrg\") pod \"44ff94ab-7f35-4a62-8217-568c5f36047a\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813415 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-config\") pod \"44ff94ab-7f35-4a62-8217-568c5f36047a\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813450 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-proxy-ca-bundles\") pod \"44ff94ab-7f35-4a62-8217-568c5f36047a\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813472 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-client-ca\") pod \"44ff94ab-7f35-4a62-8217-568c5f36047a\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813531 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ff94ab-7f35-4a62-8217-568c5f36047a-serving-cert\") pod \"44ff94ab-7f35-4a62-8217-568c5f36047a\" (UID: \"44ff94ab-7f35-4a62-8217-568c5f36047a\") " Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813822 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pl7m\" (UniqueName: \"kubernetes.io/projected/68a2d8c8-392c-4f88-b685-2827c52e8de5-kube-api-access-6pl7m\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813835 5047 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813858 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/68a2d8c8-392c-4f88-b685-2827c52e8de5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.813868 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a2d8c8-392c-4f88-b685-2827c52e8de5-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.814180 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq"] Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.814503 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-config" (OuterVolumeSpecName: "config") pod "44ff94ab-7f35-4a62-8217-568c5f36047a" (UID: "44ff94ab-7f35-4a62-8217-568c5f36047a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.814794 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-client-ca" (OuterVolumeSpecName: "client-ca") pod "44ff94ab-7f35-4a62-8217-568c5f36047a" (UID: "44ff94ab-7f35-4a62-8217-568c5f36047a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.814951 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "44ff94ab-7f35-4a62-8217-568c5f36047a" (UID: "44ff94ab-7f35-4a62-8217-568c5f36047a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.816718 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89c65d5c-k8ffq"]
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.817616 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ff94ab-7f35-4a62-8217-568c5f36047a-kube-api-access-sbfrg" (OuterVolumeSpecName: "kube-api-access-sbfrg") pod "44ff94ab-7f35-4a62-8217-568c5f36047a" (UID: "44ff94ab-7f35-4a62-8217-568c5f36047a"). InnerVolumeSpecName "kube-api-access-sbfrg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.817842 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ff94ab-7f35-4a62-8217-568c5f36047a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "44ff94ab-7f35-4a62-8217-568c5f36047a" (UID: "44ff94ab-7f35-4a62-8217-568c5f36047a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.828177 5047 scope.go:117] "RemoveContainer" containerID="c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4"
Feb 23 06:49:43 crc kubenswrapper[5047]: E0223 06:49:43.828843 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4\": container with ID starting with c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4 not found: ID does not exist" containerID="c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4"
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.828892 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4"} err="failed to get container status \"c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4\": rpc error: code = NotFound desc = could not find container \"c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4\": container with ID starting with c536ab6f0940447e0700784cdaf008dc2c48e19501bc6c85bea87bf0793fbde4 not found: ID does not exist"
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.915153 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbfrg\" (UniqueName: \"kubernetes.io/projected/44ff94ab-7f35-4a62-8217-568c5f36047a-kube-api-access-sbfrg\") on node \"crc\" DevicePath \"\""
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.915237 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-config\") on node \"crc\" DevicePath \"\""
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.915266 5047 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.915287 5047 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44ff94ab-7f35-4a62-8217-568c5f36047a-client-ca\") on node \"crc\" DevicePath \"\""
Feb 23 06:49:43 crc kubenswrapper[5047]: I0223 06:49:43.915308 5047 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44ff94ab-7f35-4a62-8217-568c5f36047a-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.116080 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd"]
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.119402 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-546ffcb6cc-2xhxd"]
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144281 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"]
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144529 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ff94ab-7f35-4a62-8217-568c5f36047a" containerName="controller-manager"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144544 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ff94ab-7f35-4a62-8217-568c5f36047a" containerName="controller-manager"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144560 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="extract-utilities"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144566 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="extract-utilities"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144575 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144582 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144591 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a2d8c8-392c-4f88-b685-2827c52e8de5" containerName="route-controller-manager"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144598 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a2d8c8-392c-4f88-b685-2827c52e8de5" containerName="route-controller-manager"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144604 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="extract-utilities"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144610 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="extract-utilities"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144617 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="extract-content"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144623 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="extract-content"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144632 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="extract-content"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144639 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="extract-content"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144649 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="extract-utilities"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144655 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="extract-utilities"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144666 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144673 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144683 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144689 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: E0223 06:49:44.144696 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="extract-content"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144704 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="extract-content"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144807 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a2d8c8-392c-4f88-b685-2827c52e8de5" containerName="route-controller-manager"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144818 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d5e28d-8de9-4514-92d8-05223c439cd4" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144825 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb237ea-bf04-4d9d-bbb0-2363c97e139b" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144838 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b848e749-e1a5-47ab-aca5-13e799a2504e" containerName="registry-server"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.144845 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ff94ab-7f35-4a62-8217-568c5f36047a" containerName="controller-manager"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.145379 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.147473 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.147473 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.147700 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.147712 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.149128 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.149510 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.156318 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.163336 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"]
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.219314 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-serving-cert\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.219401 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-client-ca\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.219431 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-config\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.219452 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4sjp\" (UniqueName: \"kubernetes.io/projected/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-kube-api-access-g4sjp\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.219620 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-proxy-ca-bundles\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.321691 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-client-ca\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.321760 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-config\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.321796 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4sjp\" (UniqueName: \"kubernetes.io/projected/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-kube-api-access-g4sjp\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.321847 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-proxy-ca-bundles\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.321918 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-serving-cert\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.322967 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-client-ca\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.323149 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-proxy-ca-bundles\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.323780 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-config\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.327615 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-serving-cert\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.339946 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4sjp\" (UniqueName: \"kubernetes.io/projected/ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5-kube-api-access-g4sjp\") pod \"controller-manager-7d8c8d4655-6f9fr\" (UID: \"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5\") " pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.349442 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ff94ab-7f35-4a62-8217-568c5f36047a" path="/var/lib/kubelet/pods/44ff94ab-7f35-4a62-8217-568c5f36047a/volumes"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.350155 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a2d8c8-392c-4f88-b685-2827c52e8de5" path="/var/lib/kubelet/pods/68a2d8c8-392c-4f88-b685-2827c52e8de5/volumes"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.464676 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:44 crc kubenswrapper[5047]: I0223 06:49:44.895899 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"]
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.151276 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"]
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.152076 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.154254 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.155238 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.155732 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.155997 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.156131 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.156266 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.170445 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"]
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.235138 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f330605b-3ab7-4d51-ab9b-6031a8580492-serving-cert\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.235224 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f330605b-3ab7-4d51-ab9b-6031a8580492-client-ca\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.235248 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f330605b-3ab7-4d51-ab9b-6031a8580492-config\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.235388 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89hfm\" (UniqueName: \"kubernetes.io/projected/f330605b-3ab7-4d51-ab9b-6031a8580492-kube-api-access-89hfm\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.336369 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f330605b-3ab7-4d51-ab9b-6031a8580492-client-ca\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.336426 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f330605b-3ab7-4d51-ab9b-6031a8580492-config\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.336485 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89hfm\" (UniqueName: \"kubernetes.io/projected/f330605b-3ab7-4d51-ab9b-6031a8580492-kube-api-access-89hfm\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.336525 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f330605b-3ab7-4d51-ab9b-6031a8580492-serving-cert\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.338153 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f330605b-3ab7-4d51-ab9b-6031a8580492-client-ca\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.338405 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f330605b-3ab7-4d51-ab9b-6031a8580492-config\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.352942 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f330605b-3ab7-4d51-ab9b-6031a8580492-serving-cert\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.356894 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89hfm\" (UniqueName: \"kubernetes.io/projected/f330605b-3ab7-4d51-ab9b-6031a8580492-kube-api-access-89hfm\") pod \"route-controller-manager-55478764c7-xs529\" (UID: \"f330605b-3ab7-4d51-ab9b-6031a8580492\") " pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.482349 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.766727 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"]
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.811792 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr" event={"ID":"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5","Type":"ContainerStarted","Data":"57719e81b42ec924c7df3c4ff16f09207dc231d03f42547738450d6f61837882"}
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.811993 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr" event={"ID":"ff746c88-7d7f-4fa1-8d11-85df9b4cf3b5","Type":"ContainerStarted","Data":"05ce0238ff32964d94aeed5a7fb172dc5d8356e3babf0d6da9896ba1d15b6f10"}
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.812114 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.821750 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr"
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.823421 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529" event={"ID":"f330605b-3ab7-4d51-ab9b-6031a8580492","Type":"ContainerStarted","Data":"3396a89d4cd21e4aecf092ddfb2dc0bdccd3929b093c4a27e75277abaa2e9b1a"}
Feb 23 06:49:45 crc kubenswrapper[5047]: I0223 06:49:45.839317 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d8c8d4655-6f9fr" podStartSLOduration=2.839291082 podStartE2EDuration="2.839291082s" podCreationTimestamp="2026-02-23 06:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:49:45.835009302 +0000 UTC m=+308.086336466" watchObservedRunningTime="2026-02-23 06:49:45.839291082 +0000 UTC m=+308.090618216"
Feb 23 06:49:46 crc kubenswrapper[5047]: I0223 06:49:46.832497 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529" event={"ID":"f330605b-3ab7-4d51-ab9b-6031a8580492","Type":"ContainerStarted","Data":"e2367fa8bbb1900db33eeef2418e5e2f89a0d287dd4415cfc551eaf3ce5fc481"}
Feb 23 06:49:46 crc kubenswrapper[5047]: I0223 06:49:46.832703 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:46 crc kubenswrapper[5047]: I0223 06:49:46.841012 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529"
Feb 23 06:49:46 crc kubenswrapper[5047]: I0223 06:49:46.859530 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55478764c7-xs529" podStartSLOduration=3.859504436 podStartE2EDuration="3.859504436s" podCreationTimestamp="2026-02-23 06:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:49:46.855854875 +0000 UTC m=+309.107182099" watchObservedRunningTime="2026-02-23 06:49:46.859504436 +0000 UTC m=+309.110831570"
Feb 23 06:49:48 crc kubenswrapper[5047]: I0223 06:49:48.991817 5047 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 23 06:49:48 crc kubenswrapper[5047]: I0223 06:49:48.992853 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.037679 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.079651 5047 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.080197 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8" gracePeriod=15
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.080195 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab" gracePeriod=15
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.080264 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956" gracePeriod=15
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.080216 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b" gracePeriod=15
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.080402 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043" gracePeriod=15
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.081659 5047 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.081940 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.081956 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.081970 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.081980 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.081991 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.081999 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.082020 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082029 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.082040 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082048 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.082059 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082068 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.082080 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082089 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.082113 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082122 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.082133 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082142 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082295 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082309 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082327 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082339 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082350 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082360 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082370 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082380 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.082496 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082506 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.082671 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.093102 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.093227 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.093287 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.093330 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.093402 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.194763 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.194835 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.194861 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.194925 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.194944 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.194985 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.195022 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.195044 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.195146 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.195183 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.195210 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.195236 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.195266 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.296352 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.296481 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.296647 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.296746 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.296829 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.296830 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.328162 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:49 crc kubenswrapper[5047]: E0223 06:49:49.361633 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd712b619a42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:49:49.359684162 +0000 UTC m=+311.611011306,LastTimestamp:2026-02-23 06:49:49.359684162 +0000 UTC m=+311.611011306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.854975 5047 generic.go:334] "Generic (PLEG): container finished" podID="23875e6d-032c-4db1-ae77-f1f19254115b" containerID="12164f032ba3cde43f1fdea995b3c6af6f06495738f7be698bbe841e2f57aa67" exitCode=0 Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.855068 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"23875e6d-032c-4db1-ae77-f1f19254115b","Type":"ContainerDied","Data":"12164f032ba3cde43f1fdea995b3c6af6f06495738f7be698bbe841e2f57aa67"} Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.855972 5047 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.856541 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.857157 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.857501 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2"} Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.857582 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0af86c9d92ceb4f901547f70db79fbfe6000fbe36715be7f999668f3fbd4060f"} Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.858496 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.859371 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.859870 5047 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.861686 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.865201 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.866410 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8" exitCode=0 Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.866444 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b" exitCode=0 Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.866455 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab" exitCode=0 Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.866468 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956" exitCode=2 Feb 23 06:49:49 crc kubenswrapper[5047]: I0223 06:49:49.866609 5047 scope.go:117] "RemoveContainer" containerID="d8bfa98a7f9d4f48fe4c41022ea1d12357d9257af7bf5e1ef45df03baebb880a" Feb 23 06:49:50 crc kubenswrapper[5047]: I0223 06:49:50.882858 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:49:50 crc kubenswrapper[5047]: E0223 06:49:50.912590 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd712b619a42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:49:49.359684162 +0000 UTC m=+311.611011306,LastTimestamp:2026-02-23 06:49:49.359684162 +0000 UTC m=+311.611011306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.292730 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.294565 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.295388 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.403372 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23875e6d-032c-4db1-ae77-f1f19254115b-kube-api-access\") pod \"23875e6d-032c-4db1-ae77-f1f19254115b\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.403461 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-var-lock\") pod \"23875e6d-032c-4db1-ae77-f1f19254115b\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.403740 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-kubelet-dir\") pod \"23875e6d-032c-4db1-ae77-f1f19254115b\" (UID: \"23875e6d-032c-4db1-ae77-f1f19254115b\") " Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.404159 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-var-lock" (OuterVolumeSpecName: "var-lock") pod "23875e6d-032c-4db1-ae77-f1f19254115b" (UID: "23875e6d-032c-4db1-ae77-f1f19254115b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.404283 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "23875e6d-032c-4db1-ae77-f1f19254115b" (UID: "23875e6d-032c-4db1-ae77-f1f19254115b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.404369 5047 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-var-lock\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.409752 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23875e6d-032c-4db1-ae77-f1f19254115b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "23875e6d-032c-4db1-ae77-f1f19254115b" (UID: "23875e6d-032c-4db1-ae77-f1f19254115b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.505501 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23875e6d-032c-4db1-ae77-f1f19254115b-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.505554 5047 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23875e6d-032c-4db1-ae77-f1f19254115b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.536128 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.537236 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.538627 5047 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.539339 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.540053 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.610486 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.610627 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.610701 5047 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.610783 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.610858 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.610893 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.612282 5047 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.612322 5047 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.612340 5047 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.894592 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"23875e6d-032c-4db1-ae77-f1f19254115b","Type":"ContainerDied","Data":"0d6c79ebc8514b2e91042c88b23a81c3da4ce8054c4204720a752cbe79942f01"} Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.894652 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d6c79ebc8514b2e91042c88b23a81c3da4ce8054c4204720a752cbe79942f01" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.894653 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.899866 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.901117 5047 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043" exitCode=0 Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.901294 5047 scope.go:117] "RemoveContainer" containerID="81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.901295 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.912062 5047 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.912704 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.913288 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.924301 5047 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.925150 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.925304 5047 scope.go:117] "RemoveContainer" containerID="773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.925708 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.946666 5047 scope.go:117] "RemoveContainer" containerID="68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.968883 5047 scope.go:117] "RemoveContainer" containerID="0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956" Feb 23 06:49:51 crc kubenswrapper[5047]: I0223 06:49:51.987730 5047 scope.go:117] 
"RemoveContainer" containerID="d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.007394 5047 scope.go:117] "RemoveContainer" containerID="948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.029208 5047 scope.go:117] "RemoveContainer" containerID="81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8" Feb 23 06:49:52 crc kubenswrapper[5047]: E0223 06:49:52.030058 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\": container with ID starting with 81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8 not found: ID does not exist" containerID="81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.030133 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8"} err="failed to get container status \"81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\": rpc error: code = NotFound desc = could not find container \"81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8\": container with ID starting with 81192fa808945cc292e24f68288e23cddfa5ea1521b63d8b3a9fc327ad1662e8 not found: ID does not exist" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.030186 5047 scope.go:117] "RemoveContainer" containerID="773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b" Feb 23 06:49:52 crc kubenswrapper[5047]: E0223 06:49:52.030681 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\": container with ID starting with 
773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b not found: ID does not exist" containerID="773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.030729 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b"} err="failed to get container status \"773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\": rpc error: code = NotFound desc = could not find container \"773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b\": container with ID starting with 773ccdd22676e3becc76f5e398cf32c22f27c0dfbdebe40851332025a8a9f13b not found: ID does not exist" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.030765 5047 scope.go:117] "RemoveContainer" containerID="68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab" Feb 23 06:49:52 crc kubenswrapper[5047]: E0223 06:49:52.031267 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\": container with ID starting with 68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab not found: ID does not exist" containerID="68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.031331 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab"} err="failed to get container status \"68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\": rpc error: code = NotFound desc = could not find container \"68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab\": container with ID starting with 68015270120a79a50daf4f6463a2aaf0c9c87957c1fd978603637b70b860abab not found: ID does not 
exist" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.031373 5047 scope.go:117] "RemoveContainer" containerID="0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956" Feb 23 06:49:52 crc kubenswrapper[5047]: E0223 06:49:52.031829 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\": container with ID starting with 0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956 not found: ID does not exist" containerID="0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.031865 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956"} err="failed to get container status \"0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\": rpc error: code = NotFound desc = could not find container \"0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956\": container with ID starting with 0741b17b07c98c60a38fd8fa6a42e46872b6cc93b4e913eefaeb637a99386956 not found: ID does not exist" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.031885 5047 scope.go:117] "RemoveContainer" containerID="d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043" Feb 23 06:49:52 crc kubenswrapper[5047]: E0223 06:49:52.032229 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\": container with ID starting with d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043 not found: ID does not exist" containerID="d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.032251 5047 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043"} err="failed to get container status \"d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\": rpc error: code = NotFound desc = could not find container \"d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043\": container with ID starting with d9e13cc61f987ff21b7ea740924eebdc8c59d43a95b1046f6a002387e6bc7043 not found: ID does not exist" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.032302 5047 scope.go:117] "RemoveContainer" containerID="948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541" Feb 23 06:49:52 crc kubenswrapper[5047]: E0223 06:49:52.032773 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\": container with ID starting with 948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541 not found: ID does not exist" containerID="948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.032836 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541"} err="failed to get container status \"948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\": rpc error: code = NotFound desc = could not find container \"948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541\": container with ID starting with 948e84f85efef4e806a934d389c89e584d0ac58383ace4793c18c6cfa036c541 not found: ID does not exist" Feb 23 06:49:52 crc kubenswrapper[5047]: I0223 06:49:52.359577 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 23 06:49:56 crc 
kubenswrapper[5047]: E0223 06:49:56.771138 5047 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:56 crc kubenswrapper[5047]: E0223 06:49:56.773384 5047 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:56 crc kubenswrapper[5047]: E0223 06:49:56.774126 5047 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:56 crc kubenswrapper[5047]: E0223 06:49:56.774751 5047 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:56 crc kubenswrapper[5047]: E0223 06:49:56.775417 5047 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:56 crc kubenswrapper[5047]: I0223 06:49:56.775496 5047 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 23 06:49:56 crc kubenswrapper[5047]: E0223 06:49:56.776084 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="200ms" Feb 23 06:49:56 
crc kubenswrapper[5047]: E0223 06:49:56.977460 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="400ms" Feb 23 06:49:57 crc kubenswrapper[5047]: E0223 06:49:57.378484 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="800ms" Feb 23 06:49:58 crc kubenswrapper[5047]: E0223 06:49:58.180166 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="1.6s" Feb 23 06:49:58 crc kubenswrapper[5047]: I0223 06:49:58.343509 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:58 crc kubenswrapper[5047]: I0223 06:49:58.344090 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:49:59 crc kubenswrapper[5047]: E0223 06:49:59.781641 5047 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.129:6443: connect: connection refused" interval="3.2s" Feb 23 06:50:00 crc kubenswrapper[5047]: E0223 06:50:00.913817 5047 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.129:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd712b619a42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:49:49.359684162 +0000 UTC m=+311.611011306,LastTimestamp:2026-02-23 06:49:49.359684162 +0000 UTC m=+311.611011306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.340670 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.342475 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.343094 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.360681 5047 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.360734 5047 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349" Feb 23 06:50:01 crc kubenswrapper[5047]: E0223 06:50:01.361476 5047 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.362422 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.985418 5047 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7b572ff2ccb6ebae7d8cc21f2cca10f1e711df149a6c4355578fc3deee306140" exitCode=0 Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.985528 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7b572ff2ccb6ebae7d8cc21f2cca10f1e711df149a6c4355578fc3deee306140"} Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.985944 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e79f9dc0a9b9bdc5c9a3653d029962cae65e59084ad32c750c882dcf307ddfe8"} Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.986282 5047 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.986296 5047 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.986864 5047 status_manager.go:851] "Failed to get status for pod" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:50:01 crc kubenswrapper[5047]: E0223 06:50:01.986940 5047 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 38.102.83.129:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:50:01 crc kubenswrapper[5047]: I0223 06:50:01.987312 5047 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.129:6443: connect: connection refused" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.236752 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" podUID="d1fdf697-2344-4656-8ae3-8f516f5dd1ca" containerName="oauth-openshift" containerID="cri-o://c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603" gracePeriod=15 Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.828743 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.899026 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-provider-selection\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.899558 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-login\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.899751 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-ocp-branding-template\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.899971 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-service-ca\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.900191 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-policies\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: 
\"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.900353 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-trusted-ca-bundle\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.900524 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-cliconfig\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.900693 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-session\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.900848 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-router-certs\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.901051 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-error\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc 
kubenswrapper[5047]: I0223 06:50:02.901237 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-serving-cert\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.901407 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-idp-0-file-data\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.901598 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-dir\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.901808 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv876\" (UniqueName: \"kubernetes.io/projected/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-kube-api-access-pv876\") pod \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\" (UID: \"d1fdf697-2344-4656-8ae3-8f516f5dd1ca\") " Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.900861 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.901196 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.901435 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.904018 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.904790 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.908779 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-kube-api-access-pv876" (OuterVolumeSpecName: "kube-api-access-pv876") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "kube-api-access-pv876". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.909720 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.914552 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.919242 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.920085 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.920502 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.920772 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.924089 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.926397 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d1fdf697-2344-4656-8ae3-8f516f5dd1ca" (UID: "d1fdf697-2344-4656-8ae3-8f516f5dd1ca"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.993885 5047 generic.go:334] "Generic (PLEG): container finished" podID="d1fdf697-2344-4656-8ae3-8f516f5dd1ca" containerID="c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603" exitCode=0
Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.993992 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m"
Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.994030 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" event={"ID":"d1fdf697-2344-4656-8ae3-8f516f5dd1ca","Type":"ContainerDied","Data":"c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603"}
Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.995556 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ttb4m" event={"ID":"d1fdf697-2344-4656-8ae3-8f516f5dd1ca","Type":"ContainerDied","Data":"e84a6843d8e4fce6afd1f147e36e876bfe99e13c535b84917e84cbd66a1ac988"}
Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.995583 5047 scope.go:117] "RemoveContainer" containerID="c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603"
Feb 23 06:50:02 crc kubenswrapper[5047]: I0223 06:50:02.999444 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.001822 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.001875 5047 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213" exitCode=1
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.001951 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213"}
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.002528 5047 scope.go:117] "RemoveContainer" containerID="db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213"
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.003948 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.003974 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.003990 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004003 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004016 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004026 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004036 5047 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004045 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv876\" (UniqueName: \"kubernetes.io/projected/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-kube-api-access-pv876\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004058 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004068 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004090 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004101 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004110 5047 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.004119 5047 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1fdf697-2344-4656-8ae3-8f516f5dd1ca-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.008627 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"83f4cd0a8c661afc64459eab10e9ba442521e2a92d7cdbc3964aeca3cfe2cc59"}
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.008674 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"00288132cdbfea63036b3da799b9914b6bb3aff02808b34453b09e31de4dbd12"}
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.008687 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"011a8eec52274a995bdae793319540d63d6ae31272dfd4c03642b072f2a07deb"}
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.015515 5047 scope.go:117] "RemoveContainer" containerID="c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603"
Feb 23 06:50:03 crc kubenswrapper[5047]: E0223 06:50:03.023007 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603\": container with ID starting with c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603 not found: ID does not exist" containerID="c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603"
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.023078 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603"} err="failed to get container status \"c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603\": rpc error: code = NotFound desc = could not find container \"c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603\": container with ID starting with c3a9e6c055a90e7776e737241eed82bc20f1b5385859bf9284e2e7fce614e603 not found: ID does not exist"
Feb 23 06:50:03 crc kubenswrapper[5047]: I0223 06:50:03.445876 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.018035 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80154c802a63a67f6fa80a708d0470c4d81f28aa4d6183531d5d4afb9c929dde"}
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.018086 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ecf549b748ce7e299ae33c8feffe36b480e6f28d29d9b474907d0ac62667e4e"}
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.018202 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.018354 5047 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349"
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.018387 5047 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349"
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.021253 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.022565 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 23 06:50:04 crc kubenswrapper[5047]: I0223 06:50:04.022639 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8fac09f4ceec44d8c4d007da5abfd18ee971eb0e28a823c9c81b05b8ecb1afc"}
Feb 23 06:50:06 crc kubenswrapper[5047]: I0223 06:50:06.362618 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:50:06 crc kubenswrapper[5047]: I0223 06:50:06.362709 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:50:06 crc kubenswrapper[5047]: I0223 06:50:06.372701 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:50:09 crc kubenswrapper[5047]: I0223 06:50:09.034330 5047 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:50:09 crc kubenswrapper[5047]: I0223 06:50:09.060975 5047 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349"
Feb 23 06:50:09 crc kubenswrapper[5047]: I0223 06:50:09.061017 5047 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349"
Feb 23 06:50:09 crc kubenswrapper[5047]: I0223 06:50:09.065136 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:50:09 crc kubenswrapper[5047]: I0223 06:50:09.069055 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b6e6ef61-a95c-4a91-8e87-6d32ca142d09"
Feb 23 06:50:09 crc kubenswrapper[5047]: I0223 06:50:09.228964 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:50:10 crc kubenswrapper[5047]: I0223 06:50:10.067762 5047 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349"
Feb 23 06:50:10 crc kubenswrapper[5047]: I0223 06:50:10.067809 5047 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="24137acc-2a91-4059-b282-cee970a1a349"
Feb 23 06:50:13 crc kubenswrapper[5047]: I0223 06:50:13.446962 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:50:13 crc kubenswrapper[5047]: I0223 06:50:13.455517 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:50:14 crc kubenswrapper[5047]: I0223 06:50:14.102407 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:50:18 crc kubenswrapper[5047]: I0223 06:50:18.360765 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b6e6ef61-a95c-4a91-8e87-6d32ca142d09"
Feb 23 06:50:19 crc kubenswrapper[5047]: I0223 06:50:19.355186 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 23 06:50:20 crc kubenswrapper[5047]: I0223 06:50:20.131720 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 23 06:50:20 crc kubenswrapper[5047]: I0223 06:50:20.659439 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 23 06:50:20 crc kubenswrapper[5047]: I0223 06:50:20.700296 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 23 06:50:20 crc kubenswrapper[5047]: I0223 06:50:20.948561 5047 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.019966 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.239494 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.266325 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.338119 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.354358 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.404032 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.456693 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.550659 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.616087 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.670379 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.680402 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.705811 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.978606 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 23 06:50:21 crc kubenswrapper[5047]: I0223 06:50:21.984210 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.185341 5047 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.207154 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.235834 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.304227 5047 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.416949 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.525557 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.659022 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.712292 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.724363 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.727427 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.792221 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.820895 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.839424 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 23 06:50:22 crc kubenswrapper[5047]: I0223 06:50:22.915198 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.110267 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.115022 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.433691 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.461939 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.612828 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.691197 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.734330 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.808826 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.854925 5047 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.867962 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.878523 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.934220 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.945318 5047 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.948884 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=34.94885141 podStartE2EDuration="34.94885141s" podCreationTimestamp="2026-02-23 06:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:50:08.745614837 +0000 UTC m=+330.996941971" watchObservedRunningTime="2026-02-23 06:50:23.94885141 +0000 UTC m=+346.200178554"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.952049 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ttb4m","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.952116 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.956144 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.956820 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.957848 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 23 06:50:23 crc kubenswrapper[5047]: I0223 06:50:23.979522 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.979492413 podStartE2EDuration="14.979492413s" podCreationTimestamp="2026-02-23 06:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:50:23.976356655 +0000 UTC m=+346.227683799" watchObservedRunningTime="2026-02-23 06:50:23.979492413 +0000 UTC m=+346.230819587"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.188715 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.252299 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.263850 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.288043 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.296679 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.301544 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.350380 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1fdf697-2344-4656-8ae3-8f516f5dd1ca" path="/var/lib/kubelet/pods/d1fdf697-2344-4656-8ae3-8f516f5dd1ca/volumes"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.420393 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.473670 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.609672 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.667095 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.668990 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.764303 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.843098 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 06:50:24 crc kubenswrapper[5047]: I0223 06:50:24.992510 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.145037 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.202832 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.281163 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.364804 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.485537 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.499967 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.501729 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.533002 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.564250 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.568461 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.621507 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.676621 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.878467 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.914743 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.926445 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.967001 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.973444 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 23 06:50:25 crc kubenswrapper[5047]: I0223 06:50:25.994098 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.172458 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.378349 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.389706 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.477271 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.479083 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.513545 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.721339 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.756008 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.849109 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.871206 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.876662 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.890329 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 23 06:50:26 crc kubenswrapper[5047]: I0223 06:50:26.992675 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.065205 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.142310 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.159690 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.172200 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.186245 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.236067 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.274915 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.309189 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.459385 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.596316 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.675998 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.722580 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 23 06:50:27 crc kubenswrapper[5047]: I0223 06:50:27.815497 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.007416 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.017646 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.085510 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.116035 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.210516 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.259043 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.283293 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.377857 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.438906 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.536085 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.568679 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.635087 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.636250 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.660673 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.795811 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.900161 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.919228 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 06:50:28 crc kubenswrapper[5047]: I0223 06:50:28.934534 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.048801 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.067194 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.143463 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.169091 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.175422 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.299730 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.364600 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.374608 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.483371 5047
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.602752 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.723011 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.871804 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 23 06:50:29 crc kubenswrapper[5047]: I0223 06:50:29.969942 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.007397 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.076044 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.127740 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.168365 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.197845 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.223521 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 
23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.504171 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.526435 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.538842 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.550503 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.554274 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.577929 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.677352 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.733487 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.745231 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.776153 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.786670 
5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.812690 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.854639 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.871171 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.905135 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.906326 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.912463 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.970693 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.982399 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 06:50:30 crc kubenswrapper[5047]: I0223 06:50:30.989201 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.011806 5047 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"config" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.019191 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.064407 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.073020 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.098818 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.113464 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.140318 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.143442 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.185448 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.268782 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.385392 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 
06:50:31.423988 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.430660 5047 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.431065 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2" gracePeriod=5 Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.466629 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.491739 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.580227 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.717501 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.799444 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.801129 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.801183 5047 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.849628 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.927796 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.974660 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 23 06:50:31 crc kubenswrapper[5047]: I0223 06:50:31.975459 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.000724 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.016966 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.034526 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.196796 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.210681 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.286682 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 
06:50:32.378672 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.695068 5047 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.704238 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.725686 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.725869 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.734517 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.895192 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.912099 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.918883 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 23 06:50:32 crc kubenswrapper[5047]: I0223 06:50:32.988529 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.148022 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 23 06:50:33 crc 
kubenswrapper[5047]: I0223 06:50:33.190014 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.268740 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.445610 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.446331 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.480414 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.650063 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.652377 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.687526 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.764758 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.837295 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 23 06:50:33 crc kubenswrapper[5047]: I0223 06:50:33.913422 5047 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.017006 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.033989 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.063893 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.146771 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.240538 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.276954 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.380560 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.434940 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.554750 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.567796 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.608670 5047 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.701547 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.709157 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-q6rsc"] Feb 23 06:50:34 crc kubenswrapper[5047]: E0223 06:50:34.709441 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1fdf697-2344-4656-8ae3-8f516f5dd1ca" containerName="oauth-openshift" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.709458 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fdf697-2344-4656-8ae3-8f516f5dd1ca" containerName="oauth-openshift" Feb 23 06:50:34 crc kubenswrapper[5047]: E0223 06:50:34.709477 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.709488 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 06:50:34 crc kubenswrapper[5047]: E0223 06:50:34.709502 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" containerName="installer" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.709511 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" containerName="installer" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.709623 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1fdf697-2344-4656-8ae3-8f516f5dd1ca" containerName="oauth-openshift" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.709641 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="23875e6d-032c-4db1-ae77-f1f19254115b" 
containerName="installer" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.709655 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.710140 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: W0223 06:50:34.712841 5047 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-cliconfig": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-cliconfig" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 23 06:50:34 crc kubenswrapper[5047]: E0223 06:50:34.712896 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-cliconfig\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:50:34 crc kubenswrapper[5047]: W0223 06:50:34.715280 5047 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 23 06:50:34 crc kubenswrapper[5047]: E0223 06:50:34.715316 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: 
secrets \"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.716519 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.716847 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.717044 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.718326 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.718639 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.718705 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.719325 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.720556 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: W0223 06:50:34.721631 5047 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-trusted-ca-bundle" is 
forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 23 06:50:34 crc kubenswrapper[5047]: E0223 06:50:34.721667 5047 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.722106 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.722643 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.734256 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.736873 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.761111 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.764222 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-q6rsc"] Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.828559 5047 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.870145 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.870256 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.870314 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.870363 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76qvm\" (UniqueName: \"kubernetes.io/projected/4c0f6d47-9fa2-4518-8e18-517a23fe7053-kube-api-access-76qvm\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871128 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871187 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c0f6d47-9fa2-4518-8e18-517a23fe7053-audit-dir\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871230 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871269 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871309 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871350 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871392 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-audit-policies\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871429 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871466 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: 
\"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.871500 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.913881 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.972560 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.972617 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.972672 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: 
\"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.972701 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.972840 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-audit-policies\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.972889 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.973701 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.973647 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-audit-policies\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.973798 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.973872 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.974373 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.974438 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " 
pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.974845 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.975216 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76qvm\" (UniqueName: \"kubernetes.io/projected/4c0f6d47-9fa2-4518-8e18-517a23fe7053-kube-api-access-76qvm\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.975260 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.975285 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c0f6d47-9fa2-4518-8e18-517a23fe7053-audit-dir\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.975378 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4c0f6d47-9fa2-4518-8e18-517a23fe7053-audit-dir\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.980680 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.981007 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.981972 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.982880 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" 
Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.983606 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.984530 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:34 crc kubenswrapper[5047]: I0223 06:50:34.990831 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.011589 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76qvm\" (UniqueName: \"kubernetes.io/projected/4c0f6d47-9fa2-4518-8e18-517a23fe7053-kube-api-access-76qvm\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.441891 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.636559 5047 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.648859 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.748563 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.758198 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.774119 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 06:50:35 crc kubenswrapper[5047]: E0223 06:50:35.973739 5047 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:50:35 crc kubenswrapper[5047]: E0223 06:50:35.973860 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-cliconfig podName:4c0f6d47-9fa2-4518-8e18-517a23fe7053 nodeName:}" failed. No retries permitted until 2026-02-23 06:50:36.473832436 +0000 UTC m=+358.725159570 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-cliconfig") pod "oauth-openshift-66f68474cb-q6rsc" (UID: "4c0f6d47-9fa2-4518-8e18-517a23fe7053") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:50:35 crc kubenswrapper[5047]: E0223 06:50:35.976330 5047 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 23 06:50:35 crc kubenswrapper[5047]: E0223 06:50:35.976378 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-trusted-ca-bundle podName:4c0f6d47-9fa2-4518-8e18-517a23fe7053 nodeName:}" failed. No retries permitted until 2026-02-23 06:50:36.476366698 +0000 UTC m=+358.727693822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-66f68474cb-q6rsc" (UID: "4c0f6d47-9fa2-4518-8e18-517a23fe7053") : failed to sync configmap cache: timed out waiting for the condition Feb 23 06:50:35 crc kubenswrapper[5047]: I0223 06:50:35.996158 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.171892 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.277762 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.293827 5047 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.501024 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.501150 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.502714 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.502801 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c0f6d47-9fa2-4518-8e18-517a23fe7053-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-q6rsc\" (UID: \"4c0f6d47-9fa2-4518-8e18-517a23fe7053\") " pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:36 crc kubenswrapper[5047]: I0223 06:50:36.531185 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.021318 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-q6rsc"] Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.069754 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.072288 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.072378 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.108742 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.108980 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.108892 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109117 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109181 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109248 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109278 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109480 5047 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109887 5047 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109901 5047 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.109818 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.116617 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.210748 5047 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.210783 5047 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.279211 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" event={"ID":"4c0f6d47-9fa2-4518-8e18-517a23fe7053","Type":"ContainerStarted","Data":"964d9bd11136424829dda53f9f2753fd60e4ad3275036b74b56839656cc0bc7a"} Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.281109 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.281154 5047 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2" exitCode=137 Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.281204 5047 scope.go:117] "RemoveContainer" containerID="f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.281369 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.299268 5047 scope.go:117] "RemoveContainer" containerID="f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2" Feb 23 06:50:37 crc kubenswrapper[5047]: E0223 06:50:37.299669 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2\": container with ID starting with f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2 not found: ID does not exist" containerID="f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2" Feb 23 06:50:37 crc kubenswrapper[5047]: I0223 06:50:37.299709 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2"} err="failed to get container status \"f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2\": rpc error: code = NotFound desc = could not find container \"f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2\": container with ID starting with f8d745d21ae6c1773e3a01bb2870ab8e29f0d8c1da28ee8c544574a410a2d8e2 not found: ID does not exist" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.278342 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.315508 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" event={"ID":"4c0f6d47-9fa2-4518-8e18-517a23fe7053","Type":"ContainerStarted","Data":"aea60a1d95e9e97fe3a5875bf1b45fedde4207d09118a4a512f77da59dced338"} Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.317504 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.330389 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.360429 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.360967 5047 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.367368 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66f68474cb-q6rsc" podStartSLOduration=61.367343483 podStartE2EDuration="1m1.367343483s" podCreationTimestamp="2026-02-23 06:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:50:38.351351953 +0000 UTC m=+360.602679117" watchObservedRunningTime="2026-02-23 06:50:38.367343483 +0000 UTC m=+360.618670617" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.368930 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.368980 5047 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4926f92e-7ed3-4b20-8717-293d50616162" Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.371802 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 06:50:38 crc kubenswrapper[5047]: I0223 06:50:38.371840 5047 kubelet.go:2673] 
"Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4926f92e-7ed3-4b20-8717-293d50616162" Feb 23 06:50:54 crc kubenswrapper[5047]: I0223 06:50:54.824094 5047 generic.go:334] "Generic (PLEG): container finished" podID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerID="4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5" exitCode=0 Feb 23 06:50:54 crc kubenswrapper[5047]: I0223 06:50:54.824180 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" event={"ID":"601765d3-2ae0-4cd2-a1fb-2c54de37487b","Type":"ContainerDied","Data":"4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5"} Feb 23 06:50:54 crc kubenswrapper[5047]: I0223 06:50:54.825622 5047 scope.go:117] "RemoveContainer" containerID="4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5" Feb 23 06:50:55 crc kubenswrapper[5047]: I0223 06:50:55.835369 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" event={"ID":"601765d3-2ae0-4cd2-a1fb-2c54de37487b","Type":"ContainerStarted","Data":"93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369"} Feb 23 06:50:55 crc kubenswrapper[5047]: I0223 06:50:55.836103 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:50:55 crc kubenswrapper[5047]: I0223 06:50:55.839208 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:51:46 crc kubenswrapper[5047]: I0223 06:51:46.760429 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 23 06:51:46 crc kubenswrapper[5047]: I0223 06:51:46.761499 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.685244 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gxvkj"] Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.686721 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.708251 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gxvkj"] Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796331 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-bound-sa-token\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796384 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4778bc8c-06d1-4126-bee6-018623335836-trusted-ca\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796434 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-kube-api-access-cvz6q\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796549 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4778bc8c-06d1-4126-bee6-018623335836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796700 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796736 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4778bc8c-06d1-4126-bee6-018623335836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796768 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4778bc8c-06d1-4126-bee6-018623335836-registry-certificates\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: 
\"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.796829 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-registry-tls\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.818855 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898040 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4778bc8c-06d1-4126-bee6-018623335836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898141 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4778bc8c-06d1-4126-bee6-018623335836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898173 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/4778bc8c-06d1-4126-bee6-018623335836-registry-certificates\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898208 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-registry-tls\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898259 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-bound-sa-token\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898344 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4778bc8c-06d1-4126-bee6-018623335836-trusted-ca\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898375 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-kube-api-access-cvz6q\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.898822 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4778bc8c-06d1-4126-bee6-018623335836-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.899613 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4778bc8c-06d1-4126-bee6-018623335836-registry-certificates\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.900398 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4778bc8c-06d1-4126-bee6-018623335836-trusted-ca\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.907623 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4778bc8c-06d1-4126-bee6-018623335836-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.907676 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-registry-tls\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 
06:52:08.917422 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-bound-sa-token\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:08 crc kubenswrapper[5047]: I0223 06:52:08.917952 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvz6q\" (UniqueName: \"kubernetes.io/projected/4778bc8c-06d1-4126-bee6-018623335836-kube-api-access-cvz6q\") pod \"image-registry-66df7c8f76-gxvkj\" (UID: \"4778bc8c-06d1-4126-bee6-018623335836\") " pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:09 crc kubenswrapper[5047]: I0223 06:52:09.008317 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:09 crc kubenswrapper[5047]: I0223 06:52:09.242122 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gxvkj"] Feb 23 06:52:09 crc kubenswrapper[5047]: I0223 06:52:09.393458 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" event={"ID":"4778bc8c-06d1-4126-bee6-018623335836","Type":"ContainerStarted","Data":"e56225f3813d467364acd50750e37d89e3644964c04b7925624697828687fcf6"} Feb 23 06:52:10 crc kubenswrapper[5047]: I0223 06:52:10.402887 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" event={"ID":"4778bc8c-06d1-4126-bee6-018623335836","Type":"ContainerStarted","Data":"e3d8c0f9b186b2fad116c119f10e0c0ee10320c93e65f62e8be954e0840a829a"} Feb 23 06:52:10 crc kubenswrapper[5047]: I0223 06:52:10.403395 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:10 crc kubenswrapper[5047]: I0223 06:52:10.429376 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" podStartSLOduration=2.429348695 podStartE2EDuration="2.429348695s" podCreationTimestamp="2026-02-23 06:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:52:10.426597878 +0000 UTC m=+452.677925052" watchObservedRunningTime="2026-02-23 06:52:10.429348695 +0000 UTC m=+452.680675839" Feb 23 06:52:16 crc kubenswrapper[5047]: I0223 06:52:16.760273 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:52:16 crc kubenswrapper[5047]: I0223 06:52:16.761206 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.511329 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gnn8v"] Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.511864 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gnn8v" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="registry-server" containerID="cri-o://e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d" gracePeriod=30 Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 
06:52:17.527107 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zhsns"] Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.531715 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fdfm4"] Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.532649 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zhsns" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="registry-server" containerID="cri-o://8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da" gracePeriod=30 Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.533178 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" containerID="cri-o://93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369" gracePeriod=30 Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.554848 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6khf"] Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.555306 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g6khf" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="registry-server" containerID="cri-o://18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863" gracePeriod=30 Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.568718 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zp4xr"] Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.569817 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.573941 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7bfp"] Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.574323 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f7bfp" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="registry-server" containerID="cri-o://d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671" gracePeriod=30 Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.585787 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zp4xr"] Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.654294 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sxrd\" (UniqueName: \"kubernetes.io/projected/3e7c7b9b-b216-4442-88b2-6c2ad1506955-kube-api-access-5sxrd\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.654384 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e7c7b9b-b216-4442-88b2-6c2ad1506955-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.654422 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/3e7c7b9b-b216-4442-88b2-6c2ad1506955-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.756135 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sxrd\" (UniqueName: \"kubernetes.io/projected/3e7c7b9b-b216-4442-88b2-6c2ad1506955-kube-api-access-5sxrd\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.757276 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e7c7b9b-b216-4442-88b2-6c2ad1506955-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.757320 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3e7c7b9b-b216-4442-88b2-6c2ad1506955-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.763638 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e7c7b9b-b216-4442-88b2-6c2ad1506955-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 
crc kubenswrapper[5047]: I0223 06:52:17.769886 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3e7c7b9b-b216-4442-88b2-6c2ad1506955-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.773385 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sxrd\" (UniqueName: \"kubernetes.io/projected/3e7c7b9b-b216-4442-88b2-6c2ad1506955-kube-api-access-5sxrd\") pod \"marketplace-operator-79b997595-zp4xr\" (UID: \"3e7c7b9b-b216-4442-88b2-6c2ad1506955\") " pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.947091 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.985636 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.986214 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:52:17 crc kubenswrapper[5047]: I0223 06:52:17.993654 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.012202 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.060859 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-operator-metrics\") pod \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.060937 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-utilities\") pod \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061017 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-trusted-ca\") pod \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061103 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-catalog-content\") pod \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061132 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-catalog-content\") pod \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061198 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-catalog-content\") pod \"83f104c4-c1f9-4714-a807-2a6368b538fd\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061249 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5pts\" (UniqueName: \"kubernetes.io/projected/83f104c4-c1f9-4714-a807-2a6368b538fd-kube-api-access-q5pts\") pod \"83f104c4-c1f9-4714-a807-2a6368b538fd\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061296 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jl2s\" (UniqueName: \"kubernetes.io/projected/46419724-8d7c-47d4-9d51-3aef5c54ab1b-kube-api-access-4jl2s\") pod \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061321 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-utilities\") pod \"83f104c4-c1f9-4714-a807-2a6368b538fd\" (UID: \"83f104c4-c1f9-4714-a807-2a6368b538fd\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061359 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhzfq\" (UniqueName: \"kubernetes.io/projected/601765d3-2ae0-4cd2-a1fb-2c54de37487b-kube-api-access-qhzfq\") pod \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\" (UID: \"601765d3-2ae0-4cd2-a1fb-2c54de37487b\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061379 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-utilities\") pod 
\"46419724-8d7c-47d4-9d51-3aef5c54ab1b\" (UID: \"46419724-8d7c-47d4-9d51-3aef5c54ab1b\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.061399 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f7xd\" (UniqueName: \"kubernetes.io/projected/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-kube-api-access-4f7xd\") pod \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\" (UID: \"8964090d-1ca0-4cb9-b2a6-f4293fdf9591\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.064204 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-utilities" (OuterVolumeSpecName: "utilities") pod "8964090d-1ca0-4cb9-b2a6-f4293fdf9591" (UID: "8964090d-1ca0-4cb9-b2a6-f4293fdf9591"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.064750 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-utilities" (OuterVolumeSpecName: "utilities") pod "46419724-8d7c-47d4-9d51-3aef5c54ab1b" (UID: "46419724-8d7c-47d4-9d51-3aef5c54ab1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.065526 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "601765d3-2ae0-4cd2-a1fb-2c54de37487b" (UID: "601765d3-2ae0-4cd2-a1fb-2c54de37487b"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.069832 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-kube-api-access-4f7xd" (OuterVolumeSpecName: "kube-api-access-4f7xd") pod "8964090d-1ca0-4cb9-b2a6-f4293fdf9591" (UID: "8964090d-1ca0-4cb9-b2a6-f4293fdf9591"). InnerVolumeSpecName "kube-api-access-4f7xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.071121 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-utilities" (OuterVolumeSpecName: "utilities") pod "83f104c4-c1f9-4714-a807-2a6368b538fd" (UID: "83f104c4-c1f9-4714-a807-2a6368b538fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.071243 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "601765d3-2ae0-4cd2-a1fb-2c54de37487b" (UID: "601765d3-2ae0-4cd2-a1fb-2c54de37487b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.071361 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.072094 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/601765d3-2ae0-4cd2-a1fb-2c54de37487b-kube-api-access-qhzfq" (OuterVolumeSpecName: "kube-api-access-qhzfq") pod "601765d3-2ae0-4cd2-a1fb-2c54de37487b" (UID: "601765d3-2ae0-4cd2-a1fb-2c54de37487b"). 
InnerVolumeSpecName "kube-api-access-qhzfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.072401 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46419724-8d7c-47d4-9d51-3aef5c54ab1b-kube-api-access-4jl2s" (OuterVolumeSpecName: "kube-api-access-4jl2s") pod "46419724-8d7c-47d4-9d51-3aef5c54ab1b" (UID: "46419724-8d7c-47d4-9d51-3aef5c54ab1b"). InnerVolumeSpecName "kube-api-access-4jl2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.075457 5047 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.075505 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.075517 5047 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/601765d3-2ae0-4cd2-a1fb-2c54de37487b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.075532 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jl2s\" (UniqueName: \"kubernetes.io/projected/46419724-8d7c-47d4-9d51-3aef5c54ab1b-kube-api-access-4jl2s\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.075543 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 
06:52:18.075554 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhzfq\" (UniqueName: \"kubernetes.io/projected/601765d3-2ae0-4cd2-a1fb-2c54de37487b-kube-api-access-qhzfq\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.075563 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.075572 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f7xd\" (UniqueName: \"kubernetes.io/projected/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-kube-api-access-4f7xd\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.081192 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f104c4-c1f9-4714-a807-2a6368b538fd-kube-api-access-q5pts" (OuterVolumeSpecName: "kube-api-access-q5pts") pod "83f104c4-c1f9-4714-a807-2a6368b538fd" (UID: "83f104c4-c1f9-4714-a807-2a6368b538fd"). InnerVolumeSpecName "kube-api-access-q5pts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.124236 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8964090d-1ca0-4cb9-b2a6-f4293fdf9591" (UID: "8964090d-1ca0-4cb9-b2a6-f4293fdf9591"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.139964 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83f104c4-c1f9-4714-a807-2a6368b538fd" (UID: "83f104c4-c1f9-4714-a807-2a6368b538fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.144218 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46419724-8d7c-47d4-9d51-3aef5c54ab1b" (UID: "46419724-8d7c-47d4-9d51-3aef5c54ab1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.176742 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-utilities\") pod \"c470fb1c-868d-420e-b6e2-61369cecd6c3\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.176799 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmbls\" (UniqueName: \"kubernetes.io/projected/c470fb1c-868d-420e-b6e2-61369cecd6c3-kube-api-access-fmbls\") pod \"c470fb1c-868d-420e-b6e2-61369cecd6c3\" (UID: \"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.176849 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-catalog-content\") pod \"c470fb1c-868d-420e-b6e2-61369cecd6c3\" (UID: 
\"c470fb1c-868d-420e-b6e2-61369cecd6c3\") " Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.177036 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46419724-8d7c-47d4-9d51-3aef5c54ab1b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.177049 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8964090d-1ca0-4cb9-b2a6-f4293fdf9591-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.177058 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f104c4-c1f9-4714-a807-2a6368b538fd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.177067 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5pts\" (UniqueName: \"kubernetes.io/projected/83f104c4-c1f9-4714-a807-2a6368b538fd-kube-api-access-q5pts\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.178097 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-utilities" (OuterVolumeSpecName: "utilities") pod "c470fb1c-868d-420e-b6e2-61369cecd6c3" (UID: "c470fb1c-868d-420e-b6e2-61369cecd6c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.183534 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c470fb1c-868d-420e-b6e2-61369cecd6c3-kube-api-access-fmbls" (OuterVolumeSpecName: "kube-api-access-fmbls") pod "c470fb1c-868d-420e-b6e2-61369cecd6c3" (UID: "c470fb1c-868d-420e-b6e2-61369cecd6c3"). InnerVolumeSpecName "kube-api-access-fmbls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.237155 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zp4xr"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.280892 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.280959 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmbls\" (UniqueName: \"kubernetes.io/projected/c470fb1c-868d-420e-b6e2-61369cecd6c3-kube-api-access-fmbls\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.364299 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c470fb1c-868d-420e-b6e2-61369cecd6c3" (UID: "c470fb1c-868d-420e-b6e2-61369cecd6c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.384850 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c470fb1c-868d-420e-b6e2-61369cecd6c3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.457595 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" event={"ID":"3e7c7b9b-b216-4442-88b2-6c2ad1506955","Type":"ContainerStarted","Data":"717a54c233f838d2b3959e6337882e81c16d68203792d36cbd1af2f4e4f035f1"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.457655 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" event={"ID":"3e7c7b9b-b216-4442-88b2-6c2ad1506955","Type":"ContainerStarted","Data":"a65d982c4e0cd8ea5e16398ff5b34bb93032a51d10f9eba4996e5ec3be3dce6e"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.458800 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.460043 5047 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zp4xr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body= Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.460223 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" podUID="3e7c7b9b-b216-4442-88b2-6c2ad1506955" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 
06:52:18.460813 5047 generic.go:334] "Generic (PLEG): container finished" podID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerID="8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da" exitCode=0 Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.460885 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhsns" event={"ID":"83f104c4-c1f9-4714-a807-2a6368b538fd","Type":"ContainerDied","Data":"8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.460942 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhsns" event={"ID":"83f104c4-c1f9-4714-a807-2a6368b538fd","Type":"ContainerDied","Data":"79394dc7f58674e369e970e52ac36b4d6c6a8b93a10204fcb2c952b8cc81ba35"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.460979 5047 scope.go:117] "RemoveContainer" containerID="8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.461141 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zhsns" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.465680 5047 generic.go:334] "Generic (PLEG): container finished" podID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerID="d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671" exitCode=0 Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.465771 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7bfp" event={"ID":"c470fb1c-868d-420e-b6e2-61369cecd6c3","Type":"ContainerDied","Data":"d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.465814 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7bfp" event={"ID":"c470fb1c-868d-420e-b6e2-61369cecd6c3","Type":"ContainerDied","Data":"3433b147794ed354200ac095348e1ec0837f0ebb942c33b162ae541674ee9648"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.465754 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7bfp" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.468285 5047 generic.go:334] "Generic (PLEG): container finished" podID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerID="e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d" exitCode=0 Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.468459 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnn8v" event={"ID":"46419724-8d7c-47d4-9d51-3aef5c54ab1b","Type":"ContainerDied","Data":"e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.468570 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnn8v" event={"ID":"46419724-8d7c-47d4-9d51-3aef5c54ab1b","Type":"ContainerDied","Data":"7b61f12517e6ee30d60048f3d901538c200c6c6367f94509aa8a817732669022"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.468704 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gnn8v" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.471672 5047 generic.go:334] "Generic (PLEG): container finished" podID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerID="18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863" exitCode=0 Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.471778 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6khf" event={"ID":"8964090d-1ca0-4cb9-b2a6-f4293fdf9591","Type":"ContainerDied","Data":"18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.471849 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g6khf" event={"ID":"8964090d-1ca0-4cb9-b2a6-f4293fdf9591","Type":"ContainerDied","Data":"a5f3a076d07f09179a85af2e69202d09d71ec16134ad0f709bf9ee2d323ac77e"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.471968 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g6khf" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.477482 5047 generic.go:334] "Generic (PLEG): container finished" podID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerID="93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369" exitCode=0 Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.477541 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" event={"ID":"601765d3-2ae0-4cd2-a1fb-2c54de37487b","Type":"ContainerDied","Data":"93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.477580 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" event={"ID":"601765d3-2ae0-4cd2-a1fb-2c54de37487b","Type":"ContainerDied","Data":"19f0942471b50cd0bd381679d57b51dda2bf345b557744853c5a24aedfe9cbdf"} Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.477752 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fdfm4" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.483287 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" podStartSLOduration=1.483270429 podStartE2EDuration="1.483270429s" podCreationTimestamp="2026-02-23 06:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:52:18.478552966 +0000 UTC m=+460.729880100" watchObservedRunningTime="2026-02-23 06:52:18.483270429 +0000 UTC m=+460.734597573" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.489357 5047 scope.go:117] "RemoveContainer" containerID="b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.501675 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zhsns"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.504848 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zhsns"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.524878 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6khf"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.525337 5047 scope.go:117] "RemoveContainer" containerID="dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.540257 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g6khf"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.548102 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gnn8v"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.552172 5047 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gnn8v"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.554709 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fdfm4"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.556268 5047 scope.go:117] "RemoveContainer" containerID="8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.558366 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da\": container with ID starting with 8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da not found: ID does not exist" containerID="8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.558524 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da"} err="failed to get container status \"8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da\": rpc error: code = NotFound desc = could not find container \"8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da\": container with ID starting with 8ed7f78c234aa50c4fe1b7033a3f434673f22d836693d48e45e440cdcf28a6da not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.558651 5047 scope.go:117] "RemoveContainer" containerID="b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.559990 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fdfm4"] Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.561759 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2\": container with ID starting with b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2 not found: ID does not exist" containerID="b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.561806 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2"} err="failed to get container status \"b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2\": rpc error: code = NotFound desc = could not find container \"b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2\": container with ID starting with b46d2932d6ae16021fc34ef9b5b8220f11fcb00001da97658dc74da6a6aa31e2 not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.561837 5047 scope.go:117] "RemoveContainer" containerID="dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.562271 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7\": container with ID starting with dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7 not found: ID does not exist" containerID="dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.562304 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7"} err="failed to get container status \"dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7\": rpc error: code = NotFound desc = could not find container 
\"dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7\": container with ID starting with dbd359375b507a12f1592d9338dfc7b78470829f3cd3aa042fb3511727e764e7 not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.562327 5047 scope.go:117] "RemoveContainer" containerID="d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.562794 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7bfp"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.568490 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f7bfp"] Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.582191 5047 scope.go:117] "RemoveContainer" containerID="f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.602650 5047 scope.go:117] "RemoveContainer" containerID="69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.628615 5047 scope.go:117] "RemoveContainer" containerID="d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.629168 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671\": container with ID starting with d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671 not found: ID does not exist" containerID="d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.629243 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671"} err="failed to get container status 
\"d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671\": rpc error: code = NotFound desc = could not find container \"d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671\": container with ID starting with d4e76bc72a160f061b4b98927080be677655d8f47dc067970180be6891ece671 not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.629287 5047 scope.go:117] "RemoveContainer" containerID="f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.629647 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78\": container with ID starting with f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78 not found: ID does not exist" containerID="f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.629701 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78"} err="failed to get container status \"f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78\": rpc error: code = NotFound desc = could not find container \"f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78\": container with ID starting with f259ff89e68ae58894a56f89824d3c3569b17b202f718f0f9070c9506508cb78 not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.629730 5047 scope.go:117] "RemoveContainer" containerID="69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.630291 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42\": container with ID starting with 69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42 not found: ID does not exist" containerID="69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.630316 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42"} err="failed to get container status \"69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42\": rpc error: code = NotFound desc = could not find container \"69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42\": container with ID starting with 69410343c94b91aabfc9390d5bf57547d09b63efb0eec0fc80fa9adc8eb0aa42 not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.630334 5047 scope.go:117] "RemoveContainer" containerID="e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.703068 5047 scope.go:117] "RemoveContainer" containerID="c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.723660 5047 scope.go:117] "RemoveContainer" containerID="88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.759966 5047 scope.go:117] "RemoveContainer" containerID="e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.761736 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d\": container with ID starting with e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d not found: ID does not exist" 
containerID="e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.761783 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d"} err="failed to get container status \"e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d\": rpc error: code = NotFound desc = could not find container \"e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d\": container with ID starting with e412cd0935d9a8ee580de3acadeb6ddb981572eea673d6e4ec9c08529f32320d not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.761808 5047 scope.go:117] "RemoveContainer" containerID="c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.762156 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f\": container with ID starting with c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f not found: ID does not exist" containerID="c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.762187 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f"} err="failed to get container status \"c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f\": rpc error: code = NotFound desc = could not find container \"c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f\": container with ID starting with c03820942d0513eaa46c1cc655a487c4ade9e636f7ba47b413748deb5d7a5f1f not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.762202 5047 scope.go:117] 
"RemoveContainer" containerID="88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.762604 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f\": container with ID starting with 88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f not found: ID does not exist" containerID="88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.762669 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f"} err="failed to get container status \"88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f\": rpc error: code = NotFound desc = could not find container \"88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f\": container with ID starting with 88a31aef10db7dce12dfcfb30eddfe9c8a634bdbaf41617d116a9153b08e4a0f not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.762711 5047 scope.go:117] "RemoveContainer" containerID="18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.776666 5047 scope.go:117] "RemoveContainer" containerID="3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.795140 5047 scope.go:117] "RemoveContainer" containerID="ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.812391 5047 scope.go:117] "RemoveContainer" containerID="18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.812842 5047 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863\": container with ID starting with 18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863 not found: ID does not exist" containerID="18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.812891 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863"} err="failed to get container status \"18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863\": rpc error: code = NotFound desc = could not find container \"18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863\": container with ID starting with 18fae45da81a0ab477442f3ee4e08d740fbde08563cb3c4c9029e5357f14f863 not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.812940 5047 scope.go:117] "RemoveContainer" containerID="3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.813446 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b\": container with ID starting with 3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b not found: ID does not exist" containerID="3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.813489 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b"} err="failed to get container status \"3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b\": rpc error: code = NotFound desc = could not find container 
\"3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b\": container with ID starting with 3cf758c7c7fedd67412aae0fb43c219720ae2afbd685a4157a4c306141f1359b not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.813523 5047 scope.go:117] "RemoveContainer" containerID="ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.813788 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c\": container with ID starting with ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c not found: ID does not exist" containerID="ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.813810 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c"} err="failed to get container status \"ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c\": rpc error: code = NotFound desc = could not find container \"ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c\": container with ID starting with ea988d7f2f3e97a697e52b9c07413bc629420ae562398d6b7fd7258f51474f7c not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.813831 5047 scope.go:117] "RemoveContainer" containerID="93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.827456 5047 scope.go:117] "RemoveContainer" containerID="4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.841778 5047 scope.go:117] "RemoveContainer" containerID="93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369" Feb 23 06:52:18 crc 
kubenswrapper[5047]: E0223 06:52:18.843252 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369\": container with ID starting with 93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369 not found: ID does not exist" containerID="93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.843308 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369"} err="failed to get container status \"93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369\": rpc error: code = NotFound desc = could not find container \"93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369\": container with ID starting with 93ab17697e146d1c348f2b45d7d1fd4d2dd5b698c048aaf60b9221ac55471369 not found: ID does not exist" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.843344 5047 scope.go:117] "RemoveContainer" containerID="4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5" Feb 23 06:52:18 crc kubenswrapper[5047]: E0223 06:52:18.844364 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5\": container with ID starting with 4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5 not found: ID does not exist" containerID="4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5" Feb 23 06:52:18 crc kubenswrapper[5047]: I0223 06:52:18.844406 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5"} err="failed to get container status 
\"4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5\": rpc error: code = NotFound desc = could not find container \"4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5\": container with ID starting with 4c6fbfd459b7a7aaa45a0830c9aebf570aeed540eee5a950bb9132edecb8efb5 not found: ID does not exist" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.493869 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zp4xr" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.726962 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t6m29"] Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727216 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727231 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727243 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727249 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727260 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727267 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727275 5047 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727284 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727295 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727303 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727310 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727316 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727324 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727330 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727340 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727346 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="extract-utilities" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727360 5047 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727366 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727374 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727380 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727388 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727394 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727403 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727410 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727417 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727423 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="extract-content" Feb 23 06:52:19 crc kubenswrapper[5047]: E0223 06:52:19.727430 5047 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727435 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727521 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727536 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727547 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727556 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" containerName="registry-server" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727565 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.727737 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" containerName="marketplace-operator" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.728438 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.734393 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.749727 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6m29"] Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.816025 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34827715-7787-4f5b-b31d-27a4a217a266-utilities\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.816105 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxw7v\" (UniqueName: \"kubernetes.io/projected/34827715-7787-4f5b-b31d-27a4a217a266-kube-api-access-rxw7v\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.816449 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34827715-7787-4f5b-b31d-27a4a217a266-catalog-content\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.917528 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34827715-7787-4f5b-b31d-27a4a217a266-catalog-content\") pod \"redhat-marketplace-t6m29\" (UID: 
\"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.917641 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34827715-7787-4f5b-b31d-27a4a217a266-utilities\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.917675 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxw7v\" (UniqueName: \"kubernetes.io/projected/34827715-7787-4f5b-b31d-27a4a217a266-kube-api-access-rxw7v\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.919111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34827715-7787-4f5b-b31d-27a4a217a266-utilities\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.919412 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34827715-7787-4f5b-b31d-27a4a217a266-catalog-content\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.937359 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85n7r"] Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.941896 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.946331 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.952008 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxw7v\" (UniqueName: \"kubernetes.io/projected/34827715-7787-4f5b-b31d-27a4a217a266-kube-api-access-rxw7v\") pod \"redhat-marketplace-t6m29\" (UID: \"34827715-7787-4f5b-b31d-27a4a217a266\") " pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:19 crc kubenswrapper[5047]: I0223 06:52:19.959206 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85n7r"] Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.018748 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f58084-18f9-4231-b254-c2c70ffc1909-utilities\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.018850 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f58084-18f9-4231-b254-c2c70ffc1909-catalog-content\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.018962 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p7zh\" (UniqueName: \"kubernetes.io/projected/97f58084-18f9-4231-b254-c2c70ffc1909-kube-api-access-4p7zh\") pod \"community-operators-85n7r\" (UID: 
\"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.058267 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.121107 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f58084-18f9-4231-b254-c2c70ffc1909-utilities\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.121406 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f58084-18f9-4231-b254-c2c70ffc1909-catalog-content\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.121514 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p7zh\" (UniqueName: \"kubernetes.io/projected/97f58084-18f9-4231-b254-c2c70ffc1909-kube-api-access-4p7zh\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.123011 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97f58084-18f9-4231-b254-c2c70ffc1909-catalog-content\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.123508 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97f58084-18f9-4231-b254-c2c70ffc1909-utilities\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.147747 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p7zh\" (UniqueName: \"kubernetes.io/projected/97f58084-18f9-4231-b254-c2c70ffc1909-kube-api-access-4p7zh\") pod \"community-operators-85n7r\" (UID: \"97f58084-18f9-4231-b254-c2c70ffc1909\") " pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.296032 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.315239 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6m29"] Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.390479 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46419724-8d7c-47d4-9d51-3aef5c54ab1b" path="/var/lib/kubelet/pods/46419724-8d7c-47d4-9d51-3aef5c54ab1b/volumes" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.392875 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601765d3-2ae0-4cd2-a1fb-2c54de37487b" path="/var/lib/kubelet/pods/601765d3-2ae0-4cd2-a1fb-2c54de37487b/volumes" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.393680 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f104c4-c1f9-4714-a807-2a6368b538fd" path="/var/lib/kubelet/pods/83f104c4-c1f9-4714-a807-2a6368b538fd/volumes" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.396277 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8964090d-1ca0-4cb9-b2a6-f4293fdf9591" 
path="/var/lib/kubelet/pods/8964090d-1ca0-4cb9-b2a6-f4293fdf9591/volumes" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.398214 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c470fb1c-868d-420e-b6e2-61369cecd6c3" path="/var/lib/kubelet/pods/c470fb1c-868d-420e-b6e2-61369cecd6c3/volumes" Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.500258 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6m29" event={"ID":"34827715-7787-4f5b-b31d-27a4a217a266","Type":"ContainerStarted","Data":"9d95e19c783798e1860b154236d7c015cbe3571ec450581c3adeb5cb8b5deba0"} Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.500294 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6m29" event={"ID":"34827715-7787-4f5b-b31d-27a4a217a266","Type":"ContainerStarted","Data":"880b7b9fbded7d6639e5a338d6cd2dd52fcf8f1ab6735a76a6b637cd5752124b"} Feb 23 06:52:20 crc kubenswrapper[5047]: I0223 06:52:20.565363 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85n7r"] Feb 23 06:52:21 crc kubenswrapper[5047]: I0223 06:52:21.506513 5047 generic.go:334] "Generic (PLEG): container finished" podID="97f58084-18f9-4231-b254-c2c70ffc1909" containerID="983bf09e57bfee124950968e9c85da528dd215ddf2203548a4ff1d3fbe7f2bd0" exitCode=0 Feb 23 06:52:21 crc kubenswrapper[5047]: I0223 06:52:21.506622 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85n7r" event={"ID":"97f58084-18f9-4231-b254-c2c70ffc1909","Type":"ContainerDied","Data":"983bf09e57bfee124950968e9c85da528dd215ddf2203548a4ff1d3fbe7f2bd0"} Feb 23 06:52:21 crc kubenswrapper[5047]: I0223 06:52:21.506677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85n7r" 
event={"ID":"97f58084-18f9-4231-b254-c2c70ffc1909","Type":"ContainerStarted","Data":"5246f7e4bc823f554c9dea161d3499a1c445132baf93f36e85e70a382e6b016f"} Feb 23 06:52:21 crc kubenswrapper[5047]: I0223 06:52:21.510650 5047 generic.go:334] "Generic (PLEG): container finished" podID="34827715-7787-4f5b-b31d-27a4a217a266" containerID="9d95e19c783798e1860b154236d7c015cbe3571ec450581c3adeb5cb8b5deba0" exitCode=0 Feb 23 06:52:21 crc kubenswrapper[5047]: I0223 06:52:21.510784 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6m29" event={"ID":"34827715-7787-4f5b-b31d-27a4a217a266","Type":"ContainerDied","Data":"9d95e19c783798e1860b154236d7c015cbe3571ec450581c3adeb5cb8b5deba0"} Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.134365 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f7wgb"] Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.136481 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.141821 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.151090 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7wgb"] Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.272306 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zrbf\" (UniqueName: \"kubernetes.io/projected/35593a4c-662a-4df0-8076-963962ffb460-kube-api-access-8zrbf\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.272393 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35593a4c-662a-4df0-8076-963962ffb460-utilities\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.272427 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35593a4c-662a-4df0-8076-963962ffb460-catalog-content\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.330703 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xp9kv"] Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.331824 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.337842 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.355720 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xp9kv"] Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.373339 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zrbf\" (UniqueName: \"kubernetes.io/projected/35593a4c-662a-4df0-8076-963962ffb460-kube-api-access-8zrbf\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.373410 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35593a4c-662a-4df0-8076-963962ffb460-utilities\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.373440 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35593a4c-662a-4df0-8076-963962ffb460-catalog-content\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.373934 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35593a4c-662a-4df0-8076-963962ffb460-utilities\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 
06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.374018 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35593a4c-662a-4df0-8076-963962ffb460-catalog-content\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.396022 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zrbf\" (UniqueName: \"kubernetes.io/projected/35593a4c-662a-4df0-8076-963962ffb460-kube-api-access-8zrbf\") pod \"redhat-operators-f7wgb\" (UID: \"35593a4c-662a-4df0-8076-963962ffb460\") " pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.475041 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-utilities\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.475119 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-catalog-content\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.475144 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk68p\" (UniqueName: \"kubernetes.io/projected/e6cf7a53-d471-4e95-b648-967929583e12-kube-api-access-xk68p\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " 
pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.515674 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.518062 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85n7r" event={"ID":"97f58084-18f9-4231-b254-c2c70ffc1909","Type":"ContainerStarted","Data":"551890c05d948ecc23def44805e512f3148bc8449354ec64e2e20d00d065279f"} Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.520825 5047 generic.go:334] "Generic (PLEG): container finished" podID="34827715-7787-4f5b-b31d-27a4a217a266" containerID="e3c6e46fed3c33e83f4b27a83a1fe0b1e7383b849432e91e9086c0e6713fd4a7" exitCode=0 Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.520886 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6m29" event={"ID":"34827715-7787-4f5b-b31d-27a4a217a266","Type":"ContainerDied","Data":"e3c6e46fed3c33e83f4b27a83a1fe0b1e7383b849432e91e9086c0e6713fd4a7"} Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.576661 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-utilities\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.577200 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-catalog-content\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.577232 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk68p\" (UniqueName: \"kubernetes.io/projected/e6cf7a53-d471-4e95-b648-967929583e12-kube-api-access-xk68p\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.577279 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-utilities\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.577678 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-catalog-content\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.602093 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk68p\" (UniqueName: \"kubernetes.io/projected/e6cf7a53-d471-4e95-b648-967929583e12-kube-api-access-xk68p\") pod \"certified-operators-xp9kv\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") " pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.655765 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.853079 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xp9kv"] Feb 23 06:52:22 crc kubenswrapper[5047]: W0223 06:52:22.863721 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6cf7a53_d471_4e95_b648_967929583e12.slice/crio-eb111bc5dcc2755dbf1e65ccda0bb162b4527b02f921e332f9f7c3f326e69425 WatchSource:0}: Error finding container eb111bc5dcc2755dbf1e65ccda0bb162b4527b02f921e332f9f7c3f326e69425: Status 404 returned error can't find the container with id eb111bc5dcc2755dbf1e65ccda0bb162b4527b02f921e332f9f7c3f326e69425 Feb 23 06:52:22 crc kubenswrapper[5047]: I0223 06:52:22.917187 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7wgb"] Feb 23 06:52:22 crc kubenswrapper[5047]: W0223 06:52:22.920931 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35593a4c_662a_4df0_8076_963962ffb460.slice/crio-8b64e48a78bd3589b49d0afc7f7c1fba00e42a20c090abaed293e791d6d695f3 WatchSource:0}: Error finding container 8b64e48a78bd3589b49d0afc7f7c1fba00e42a20c090abaed293e791d6d695f3: Status 404 returned error can't find the container with id 8b64e48a78bd3589b49d0afc7f7c1fba00e42a20c090abaed293e791d6d695f3 Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.532242 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6m29" event={"ID":"34827715-7787-4f5b-b31d-27a4a217a266","Type":"ContainerStarted","Data":"e7cbe22704f639e26cc1c43cbebbe15560480a9ba4043753d0a6360427d604c9"} Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.534519 5047 generic.go:334] "Generic (PLEG): container finished" podID="97f58084-18f9-4231-b254-c2c70ffc1909" 
containerID="551890c05d948ecc23def44805e512f3148bc8449354ec64e2e20d00d065279f" exitCode=0 Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.534606 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85n7r" event={"ID":"97f58084-18f9-4231-b254-c2c70ffc1909","Type":"ContainerDied","Data":"551890c05d948ecc23def44805e512f3148bc8449354ec64e2e20d00d065279f"} Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.536355 5047 generic.go:334] "Generic (PLEG): container finished" podID="e6cf7a53-d471-4e95-b648-967929583e12" containerID="2fc1fe6c48327f950dde8169a3751effa057a86b57ea5171620f75cd43ab3525" exitCode=0 Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.536456 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp9kv" event={"ID":"e6cf7a53-d471-4e95-b648-967929583e12","Type":"ContainerDied","Data":"2fc1fe6c48327f950dde8169a3751effa057a86b57ea5171620f75cd43ab3525"} Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.536495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp9kv" event={"ID":"e6cf7a53-d471-4e95-b648-967929583e12","Type":"ContainerStarted","Data":"eb111bc5dcc2755dbf1e65ccda0bb162b4527b02f921e332f9f7c3f326e69425"} Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.537894 5047 generic.go:334] "Generic (PLEG): container finished" podID="35593a4c-662a-4df0-8076-963962ffb460" containerID="7bbd9d55cd9e37b367185c405dcde1e0f462e822ed3d2065fd63aff37a16d3b1" exitCode=0 Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.537969 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7wgb" event={"ID":"35593a4c-662a-4df0-8076-963962ffb460","Type":"ContainerDied","Data":"7bbd9d55cd9e37b367185c405dcde1e0f462e822ed3d2065fd63aff37a16d3b1"} Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.537989 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-f7wgb" event={"ID":"35593a4c-662a-4df0-8076-963962ffb460","Type":"ContainerStarted","Data":"8b64e48a78bd3589b49d0afc7f7c1fba00e42a20c090abaed293e791d6d695f3"} Feb 23 06:52:23 crc kubenswrapper[5047]: I0223 06:52:23.558638 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t6m29" podStartSLOduration=3.107909319 podStartE2EDuration="4.558613009s" podCreationTimestamp="2026-02-23 06:52:19 +0000 UTC" firstStartedPulling="2026-02-23 06:52:21.512640883 +0000 UTC m=+463.763968047" lastFinishedPulling="2026-02-23 06:52:22.963344603 +0000 UTC m=+465.214671737" observedRunningTime="2026-02-23 06:52:23.556049137 +0000 UTC m=+465.807376301" watchObservedRunningTime="2026-02-23 06:52:23.558613009 +0000 UTC m=+465.809940153" Feb 23 06:52:24 crc kubenswrapper[5047]: I0223 06:52:24.545310 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp9kv" event={"ID":"e6cf7a53-d471-4e95-b648-967929583e12","Type":"ContainerStarted","Data":"971dcb84903d6a4031431cc42275b1125ceeec48b6502d13e4458f2981523f7e"} Feb 23 06:52:24 crc kubenswrapper[5047]: I0223 06:52:24.547698 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7wgb" event={"ID":"35593a4c-662a-4df0-8076-963962ffb460","Type":"ContainerStarted","Data":"aed6cf08666e24f9258dc46fa1c1a2cd1fdc76d2d2513b45f8383212c88ce84f"} Feb 23 06:52:24 crc kubenswrapper[5047]: I0223 06:52:24.554348 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85n7r" event={"ID":"97f58084-18f9-4231-b254-c2c70ffc1909","Type":"ContainerStarted","Data":"c5491d59cad6cb0e71a90dcbb7f5a29c3f8fd96d336eb76145a1804a8b0229bb"} Feb 23 06:52:24 crc kubenswrapper[5047]: I0223 06:52:24.610353 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85n7r" 
podStartSLOduration=3.187839902 podStartE2EDuration="5.610325251s" podCreationTimestamp="2026-02-23 06:52:19 +0000 UTC" firstStartedPulling="2026-02-23 06:52:21.50830786 +0000 UTC m=+463.759634994" lastFinishedPulling="2026-02-23 06:52:23.930793209 +0000 UTC m=+466.182120343" observedRunningTime="2026-02-23 06:52:24.591617242 +0000 UTC m=+466.842944376" watchObservedRunningTime="2026-02-23 06:52:24.610325251 +0000 UTC m=+466.861652385" Feb 23 06:52:25 crc kubenswrapper[5047]: I0223 06:52:25.561839 5047 generic.go:334] "Generic (PLEG): container finished" podID="e6cf7a53-d471-4e95-b648-967929583e12" containerID="971dcb84903d6a4031431cc42275b1125ceeec48b6502d13e4458f2981523f7e" exitCode=0 Feb 23 06:52:25 crc kubenswrapper[5047]: I0223 06:52:25.561952 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp9kv" event={"ID":"e6cf7a53-d471-4e95-b648-967929583e12","Type":"ContainerDied","Data":"971dcb84903d6a4031431cc42275b1125ceeec48b6502d13e4458f2981523f7e"} Feb 23 06:52:25 crc kubenswrapper[5047]: I0223 06:52:25.566734 5047 generic.go:334] "Generic (PLEG): container finished" podID="35593a4c-662a-4df0-8076-963962ffb460" containerID="aed6cf08666e24f9258dc46fa1c1a2cd1fdc76d2d2513b45f8383212c88ce84f" exitCode=0 Feb 23 06:52:25 crc kubenswrapper[5047]: I0223 06:52:25.566813 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7wgb" event={"ID":"35593a4c-662a-4df0-8076-963962ffb460","Type":"ContainerDied","Data":"aed6cf08666e24f9258dc46fa1c1a2cd1fdc76d2d2513b45f8383212c88ce84f"} Feb 23 06:52:26 crc kubenswrapper[5047]: I0223 06:52:26.577848 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp9kv" event={"ID":"e6cf7a53-d471-4e95-b648-967929583e12","Type":"ContainerStarted","Data":"2f784c1289001583da65857f7f2824fd08fe5bd021b2f96edbc05b9e3910702d"} Feb 23 06:52:26 crc kubenswrapper[5047]: I0223 06:52:26.580640 5047 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7wgb" event={"ID":"35593a4c-662a-4df0-8076-963962ffb460","Type":"ContainerStarted","Data":"a276072457ec4a348f55529c251bb02a6ba667d47f466f98c77073e8144b6adc"} Feb 23 06:52:26 crc kubenswrapper[5047]: I0223 06:52:26.614113 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xp9kv" podStartSLOduration=2.156035638 podStartE2EDuration="4.614095323s" podCreationTimestamp="2026-02-23 06:52:22 +0000 UTC" firstStartedPulling="2026-02-23 06:52:23.538564092 +0000 UTC m=+465.789891236" lastFinishedPulling="2026-02-23 06:52:25.996623777 +0000 UTC m=+468.247950921" observedRunningTime="2026-02-23 06:52:26.612189509 +0000 UTC m=+468.863516643" watchObservedRunningTime="2026-02-23 06:52:26.614095323 +0000 UTC m=+468.865422457" Feb 23 06:52:26 crc kubenswrapper[5047]: I0223 06:52:26.649672 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f7wgb" podStartSLOduration=2.261891194 podStartE2EDuration="4.649648929s" podCreationTimestamp="2026-02-23 06:52:22 +0000 UTC" firstStartedPulling="2026-02-23 06:52:23.539311473 +0000 UTC m=+465.790638617" lastFinishedPulling="2026-02-23 06:52:25.927069218 +0000 UTC m=+468.178396352" observedRunningTime="2026-02-23 06:52:26.646297184 +0000 UTC m=+468.897624318" watchObservedRunningTime="2026-02-23 06:52:26.649648929 +0000 UTC m=+468.900976063" Feb 23 06:52:29 crc kubenswrapper[5047]: I0223 06:52:29.014115 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gxvkj" Feb 23 06:52:29 crc kubenswrapper[5047]: I0223 06:52:29.073204 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8fh6b"] Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.059378 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.060124 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.137929 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.296438 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.296841 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.338425 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.660119 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85n7r" Feb 23 06:52:30 crc kubenswrapper[5047]: I0223 06:52:30.673680 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t6m29" Feb 23 06:52:32 crc kubenswrapper[5047]: I0223 06:52:32.516086 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:32 crc kubenswrapper[5047]: I0223 06:52:32.516729 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:32 crc kubenswrapper[5047]: I0223 06:52:32.656420 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:32 crc 
kubenswrapper[5047]: I0223 06:52:32.656487 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:32 crc kubenswrapper[5047]: I0223 06:52:32.703152 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:33 crc kubenswrapper[5047]: I0223 06:52:33.575183 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7wgb" podUID="35593a4c-662a-4df0-8076-963962ffb460" containerName="registry-server" probeResult="failure" output=< Feb 23 06:52:33 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 06:52:33 crc kubenswrapper[5047]: > Feb 23 06:52:33 crc kubenswrapper[5047]: I0223 06:52:33.668873 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xp9kv" Feb 23 06:52:42 crc kubenswrapper[5047]: I0223 06:52:42.561795 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:42 crc kubenswrapper[5047]: I0223 06:52:42.605313 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f7wgb" Feb 23 06:52:46 crc kubenswrapper[5047]: I0223 06:52:46.760376 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:52:46 crc kubenswrapper[5047]: I0223 06:52:46.762156 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:52:46 crc kubenswrapper[5047]: I0223 06:52:46.762438 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:52:46 crc kubenswrapper[5047]: I0223 06:52:46.763561 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e9070ba26d360a04508200057f939930be7816f548d48466f2bbab9cbbc2f3d"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:52:46 crc kubenswrapper[5047]: I0223 06:52:46.763824 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://5e9070ba26d360a04508200057f939930be7816f548d48466f2bbab9cbbc2f3d" gracePeriod=600 Feb 23 06:52:47 crc kubenswrapper[5047]: I0223 06:52:47.730658 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="5e9070ba26d360a04508200057f939930be7816f548d48466f2bbab9cbbc2f3d" exitCode=0 Feb 23 06:52:47 crc kubenswrapper[5047]: I0223 06:52:47.730782 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"5e9070ba26d360a04508200057f939930be7816f548d48466f2bbab9cbbc2f3d"} Feb 23 06:52:47 crc kubenswrapper[5047]: I0223 06:52:47.731707 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"8fef5335fab11be06cd5e79850ed7338a8964bb949c146ab6f73caf5aceecdbf"} Feb 23 06:52:47 crc kubenswrapper[5047]: I0223 06:52:47.731750 5047 scope.go:117] "RemoveContainer" containerID="baa052b71c3fd14ca2c5364231c35ecf105064f19c24059ef5a19025acb606ee" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.121379 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" podUID="a75720bd-50a7-4a3c-b12c-e901126d4382" containerName="registry" containerID="cri-o://6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029" gracePeriod=30 Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.601343 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724193 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724299 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-bound-sa-token\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724386 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-trusted-ca\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 
06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724426 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-tls\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724460 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a75720bd-50a7-4a3c-b12c-e901126d4382-ca-trust-extracted\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a75720bd-50a7-4a3c-b12c-e901126d4382-installation-pull-secrets\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724566 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-certificates\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.724618 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdxvf\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-kube-api-access-jdxvf\") pod \"a75720bd-50a7-4a3c-b12c-e901126d4382\" (UID: \"a75720bd-50a7-4a3c-b12c-e901126d4382\") " Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.726220 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.726246 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.734645 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.734743 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75720bd-50a7-4a3c-b12c-e901126d4382-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.735074 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.735690 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-kube-api-access-jdxvf" (OuterVolumeSpecName: "kube-api-access-jdxvf") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "kube-api-access-jdxvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.747374 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75720bd-50a7-4a3c-b12c-e901126d4382-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.751847 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a75720bd-50a7-4a3c-b12c-e901126d4382" (UID: "a75720bd-50a7-4a3c-b12c-e901126d4382"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.788061 5047 generic.go:334] "Generic (PLEG): container finished" podID="a75720bd-50a7-4a3c-b12c-e901126d4382" containerID="6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029" exitCode=0 Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.788100 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.788127 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" event={"ID":"a75720bd-50a7-4a3c-b12c-e901126d4382","Type":"ContainerDied","Data":"6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029"} Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.788192 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8fh6b" event={"ID":"a75720bd-50a7-4a3c-b12c-e901126d4382","Type":"ContainerDied","Data":"e24294f2db085f188d0a25250af1c632457d9e8508965092c6b6915e2d425e7c"} Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.788226 5047 scope.go:117] "RemoveContainer" containerID="6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.826670 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.826710 5047 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.826723 5047 reconciler_common.go:293] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a75720bd-50a7-4a3c-b12c-e901126d4382-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.826758 5047 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a75720bd-50a7-4a3c-b12c-e901126d4382-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.826770 5047 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a75720bd-50a7-4a3c-b12c-e901126d4382-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.826782 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdxvf\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-kube-api-access-jdxvf\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.826794 5047 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a75720bd-50a7-4a3c-b12c-e901126d4382-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.835979 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8fh6b"] Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.839795 5047 scope.go:117] "RemoveContainer" containerID="6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.840170 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8fh6b"] Feb 23 06:52:54 crc kubenswrapper[5047]: E0223 06:52:54.840807 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029\": container with ID starting with 6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029 not found: ID does not exist" containerID="6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029" Feb 23 06:52:54 crc kubenswrapper[5047]: I0223 06:52:54.840976 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029"} err="failed to get container status \"6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029\": rpc error: code = NotFound desc = could not find container \"6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029\": container with ID starting with 6cb6c058a68e75cd3c0e55ef29a764d7b079e59724f259a6ad417baa85dc1029 not found: ID does not exist" Feb 23 06:52:56 crc kubenswrapper[5047]: I0223 06:52:56.354588 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75720bd-50a7-4a3c-b12c-e901126d4382" path="/var/lib/kubelet/pods/a75720bd-50a7-4a3c-b12c-e901126d4382/volumes" Feb 23 06:55:16 crc kubenswrapper[5047]: I0223 06:55:16.760517 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:55:16 crc kubenswrapper[5047]: I0223 06:55:16.761627 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:55:46 crc kubenswrapper[5047]: I0223 06:55:46.759794 5047 patch_prober.go:28] interesting 
pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:55:46 crc kubenswrapper[5047]: I0223 06:55:46.760721 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:56:16 crc kubenswrapper[5047]: I0223 06:56:16.759718 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:56:16 crc kubenswrapper[5047]: I0223 06:56:16.761089 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:56:16 crc kubenswrapper[5047]: I0223 06:56:16.761182 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:56:16 crc kubenswrapper[5047]: I0223 06:56:16.762352 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fef5335fab11be06cd5e79850ed7338a8964bb949c146ab6f73caf5aceecdbf"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 23 06:56:16 crc kubenswrapper[5047]: I0223 06:56:16.762568 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://8fef5335fab11be06cd5e79850ed7338a8964bb949c146ab6f73caf5aceecdbf" gracePeriod=600 Feb 23 06:56:17 crc kubenswrapper[5047]: I0223 06:56:17.343248 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="8fef5335fab11be06cd5e79850ed7338a8964bb949c146ab6f73caf5aceecdbf" exitCode=0 Feb 23 06:56:17 crc kubenswrapper[5047]: I0223 06:56:17.343317 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"8fef5335fab11be06cd5e79850ed7338a8964bb949c146ab6f73caf5aceecdbf"} Feb 23 06:56:17 crc kubenswrapper[5047]: I0223 06:56:17.343797 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"bf0d472a1be69e8faaec3de3ca52b467dc0e01107e59ea30330ccf2c4ba1980f"} Feb 23 06:56:17 crc kubenswrapper[5047]: I0223 06:56:17.343837 5047 scope.go:117] "RemoveContainer" containerID="5e9070ba26d360a04508200057f939930be7816f548d48466f2bbab9cbbc2f3d" Feb 23 06:58:31 crc kubenswrapper[5047]: I0223 06:58:31.104046 5047 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.729557 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n2hl2"] Feb 23 06:58:42 crc kubenswrapper[5047]: E0223 06:58:42.735870 5047 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a75720bd-50a7-4a3c-b12c-e901126d4382" containerName="registry" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.736140 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75720bd-50a7-4a3c-b12c-e901126d4382" containerName="registry" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.736579 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75720bd-50a7-4a3c-b12c-e901126d4382" containerName="registry" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.738796 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.770100 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2hl2"] Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.835579 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqghb\" (UniqueName: \"kubernetes.io/projected/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-kube-api-access-qqghb\") pod \"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.836383 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-utilities\") pod \"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.836544 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-catalog-content\") pod 
\"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.937590 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-catalog-content\") pod \"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.937710 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqghb\" (UniqueName: \"kubernetes.io/projected/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-kube-api-access-qqghb\") pod \"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.937786 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-utilities\") pod \"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.938708 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-utilities\") pod \"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.939266 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-catalog-content\") pod \"certified-operators-n2hl2\" (UID: 
\"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:42 crc kubenswrapper[5047]: I0223 06:58:42.966037 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqghb\" (UniqueName: \"kubernetes.io/projected/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-kube-api-access-qqghb\") pod \"certified-operators-n2hl2\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:43 crc kubenswrapper[5047]: I0223 06:58:43.065644 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:43 crc kubenswrapper[5047]: I0223 06:58:43.366817 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2hl2"] Feb 23 06:58:43 crc kubenswrapper[5047]: I0223 06:58:43.592331 5047 generic.go:334] "Generic (PLEG): container finished" podID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerID="972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a" exitCode=0 Feb 23 06:58:43 crc kubenswrapper[5047]: I0223 06:58:43.592378 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2hl2" event={"ID":"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d","Type":"ContainerDied","Data":"972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a"} Feb 23 06:58:43 crc kubenswrapper[5047]: I0223 06:58:43.592411 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2hl2" event={"ID":"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d","Type":"ContainerStarted","Data":"3fb8a62965bb28259d5828d11725b0c86e011084ccd5bea9648910fcf4f1fbc0"} Feb 23 06:58:43 crc kubenswrapper[5047]: I0223 06:58:43.594776 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 06:58:44 crc kubenswrapper[5047]: I0223 06:58:44.605207 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2hl2" event={"ID":"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d","Type":"ContainerStarted","Data":"44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7"} Feb 23 06:58:45 crc kubenswrapper[5047]: I0223 06:58:45.617052 5047 generic.go:334] "Generic (PLEG): container finished" podID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerID="44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7" exitCode=0 Feb 23 06:58:45 crc kubenswrapper[5047]: I0223 06:58:45.617141 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2hl2" event={"ID":"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d","Type":"ContainerDied","Data":"44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7"} Feb 23 06:58:46 crc kubenswrapper[5047]: I0223 06:58:46.628021 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2hl2" event={"ID":"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d","Type":"ContainerStarted","Data":"74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8"} Feb 23 06:58:46 crc kubenswrapper[5047]: I0223 06:58:46.653950 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n2hl2" podStartSLOduration=2.248354501 podStartE2EDuration="4.6539284s" podCreationTimestamp="2026-02-23 06:58:42 +0000 UTC" firstStartedPulling="2026-02-23 06:58:43.59456095 +0000 UTC m=+845.845888084" lastFinishedPulling="2026-02-23 06:58:46.000134789 +0000 UTC m=+848.251461983" observedRunningTime="2026-02-23 06:58:46.653348974 +0000 UTC m=+848.904676108" watchObservedRunningTime="2026-02-23 06:58:46.6539284 +0000 UTC m=+848.905255524" Feb 23 06:58:46 crc kubenswrapper[5047]: I0223 06:58:46.760124 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:58:46 crc kubenswrapper[5047]: I0223 06:58:46.760195 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:58:53 crc kubenswrapper[5047]: I0223 06:58:53.070285 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:53 crc kubenswrapper[5047]: I0223 06:58:53.071199 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:53 crc kubenswrapper[5047]: I0223 06:58:53.115718 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:53 crc kubenswrapper[5047]: I0223 06:58:53.718191 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:53 crc kubenswrapper[5047]: I0223 06:58:53.773255 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2hl2"] Feb 23 06:58:55 crc kubenswrapper[5047]: I0223 06:58:55.693865 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n2hl2" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="registry-server" containerID="cri-o://74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8" gracePeriod=2 Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.071561 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.150509 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqghb\" (UniqueName: \"kubernetes.io/projected/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-kube-api-access-qqghb\") pod \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.150638 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-catalog-content\") pod \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.150698 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-utilities\") pod \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\" (UID: \"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d\") " Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.152244 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-utilities" (OuterVolumeSpecName: "utilities") pod "6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" (UID: "6ed70514-e0d2-4e31-88be-0c4bcef7fb9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.158342 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-kube-api-access-qqghb" (OuterVolumeSpecName: "kube-api-access-qqghb") pod "6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" (UID: "6ed70514-e0d2-4e31-88be-0c4bcef7fb9d"). InnerVolumeSpecName "kube-api-access-qqghb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.215421 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" (UID: "6ed70514-e0d2-4e31-88be-0c4bcef7fb9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.252657 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.252701 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.252724 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqghb\" (UniqueName: \"kubernetes.io/projected/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d-kube-api-access-qqghb\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.707218 5047 generic.go:334] "Generic (PLEG): container finished" podID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerID="74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8" exitCode=0 Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.707300 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2hl2" event={"ID":"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d","Type":"ContainerDied","Data":"74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8"} Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.707354 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-n2hl2" event={"ID":"6ed70514-e0d2-4e31-88be-0c4bcef7fb9d","Type":"ContainerDied","Data":"3fb8a62965bb28259d5828d11725b0c86e011084ccd5bea9648910fcf4f1fbc0"} Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.707387 5047 scope.go:117] "RemoveContainer" containerID="74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.708072 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n2hl2" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.737443 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2hl2"] Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.742191 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n2hl2"] Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.744384 5047 scope.go:117] "RemoveContainer" containerID="44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.777843 5047 scope.go:117] "RemoveContainer" containerID="972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.800691 5047 scope.go:117] "RemoveContainer" containerID="74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8" Feb 23 06:58:56 crc kubenswrapper[5047]: E0223 06:58:56.801397 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8\": container with ID starting with 74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8 not found: ID does not exist" containerID="74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 
06:58:56.801475 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8"} err="failed to get container status \"74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8\": rpc error: code = NotFound desc = could not find container \"74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8\": container with ID starting with 74a37e7c9899585a9088f3ccee1ed902a584c719ba590adcc88b4bde41e71ae8 not found: ID does not exist" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.801547 5047 scope.go:117] "RemoveContainer" containerID="44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7" Feb 23 06:58:56 crc kubenswrapper[5047]: E0223 06:58:56.802020 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7\": container with ID starting with 44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7 not found: ID does not exist" containerID="44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.802053 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7"} err="failed to get container status \"44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7\": rpc error: code = NotFound desc = could not find container \"44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7\": container with ID starting with 44d3745104082c94918700af1285d97f278b5451e64b7fef351eb4954c37b9f7 not found: ID does not exist" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.802070 5047 scope.go:117] "RemoveContainer" containerID="972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a" Feb 23 06:58:56 crc 
kubenswrapper[5047]: E0223 06:58:56.802473 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a\": container with ID starting with 972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a not found: ID does not exist" containerID="972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a" Feb 23 06:58:56 crc kubenswrapper[5047]: I0223 06:58:56.802501 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a"} err="failed to get container status \"972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a\": rpc error: code = NotFound desc = could not find container \"972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a\": container with ID starting with 972edac0169f670b77b5dd11a2d95a129ea70c023a04aee54a1ea348b2d38d2a not found: ID does not exist" Feb 23 06:58:58 crc kubenswrapper[5047]: I0223 06:58:58.355802 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" path="/var/lib/kubelet/pods/6ed70514-e0d2-4e31-88be-0c4bcef7fb9d/volumes" Feb 23 06:59:16 crc kubenswrapper[5047]: I0223 06:59:16.761415 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:59:16 crc kubenswrapper[5047]: I0223 06:59:16.762096 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.155685 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rklm9"] Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.156832 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-controller" containerID="cri-o://00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.157001 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="northd" containerID="cri-o://60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.156985 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="nbdb" containerID="cri-o://a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.157049 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.157072 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-node" 
containerID="cri-o://68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.157111 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-acl-logging" containerID="cri-o://d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.157348 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="sbdb" containerID="cri-o://7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.198589 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller" containerID="cri-o://990884e87515f311025d6cd5914b187883806cd5a9b1797ec1cebb761603db6c" gracePeriod=30 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.512728 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/2.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.513762 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/1.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.513839 5047 generic.go:334] "Generic (PLEG): container finished" podID="e0fbd5e6-7dcc-4a13-936e-0db2e66394e8" containerID="87ca2a1c87094d3f86cba1506eb761f303a7247c006a0f88e9dab663f29c5209" exitCode=2 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.513930 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerDied","Data":"87ca2a1c87094d3f86cba1506eb761f303a7247c006a0f88e9dab663f29c5209"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.514001 5047 scope.go:117] "RemoveContainer" containerID="86e559edb20cfecd437f4e96833d020bea8dda6dd89ead3660ea47947a6c0efd" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.514649 5047 scope.go:117] "RemoveContainer" containerID="87ca2a1c87094d3f86cba1506eb761f303a7247c006a0f88e9dab663f29c5209" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.516446 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/3.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.519867 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovn-acl-logging/0.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.520449 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovn-controller/0.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521184 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="990884e87515f311025d6cd5914b187883806cd5a9b1797ec1cebb761603db6c" exitCode=0 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521239 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016" exitCode=0 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521250 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117" 
exitCode=0 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521259 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087" exitCode=0 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521267 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626" exitCode=0 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521278 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e" exitCode=0 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521287 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624" exitCode=143 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521320 5047 generic.go:334] "Generic (PLEG): container finished" podID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerID="00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2" exitCode=143 Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521346 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"990884e87515f311025d6cd5914b187883806cd5a9b1797ec1cebb761603db6c"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521381 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521394 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521405 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521417 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521428 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521439 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.521451 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2"} Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.523769 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovnkube-controller/3.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.525689 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovn-acl-logging/0.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.526125 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovn-controller/0.log" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.526613 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.551688 5047 scope.go:117] "RemoveContainer" containerID="a6d64d637e5e9304432a989eecfbef377d4f828650d72ff59119aa2a1143b1e3" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596411 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4f2gn"] Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596659 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="registry-server" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596678 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="registry-server" Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596692 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596702 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller" Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596869 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596881 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596893 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-acl-logging"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596924 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-acl-logging"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596936 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="northd"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596943 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="northd"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596953 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="sbdb"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596960 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="sbdb"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596970 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596979 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.596991 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="extract-content"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.596999 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="extract-content"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597010 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="nbdb"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597017 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="nbdb"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597027 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597035 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597044 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597054 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597066 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-node"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597081 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-node"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597097 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597106 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597118 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kubecfg-setup"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597126 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kubecfg-setup"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597140 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="extract-utilities"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597147 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="extract-utilities"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597402 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597419 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="sbdb"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597429 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597440 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed70514-e0d2-4e31-88be-0c4bcef7fb9d" containerName="registry-server"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597453 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597462 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-node"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597470 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597477 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-acl-logging"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597489 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597497 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovn-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597507 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="northd"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597515 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="nbdb"
Feb 23 06:59:29 crc kubenswrapper[5047]: E0223 06:59:29.597625 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597635 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.597737 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" containerName="ovnkube-controller"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.599461 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.617816 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-etc-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.617855 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-var-lib-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.617889 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5r2\" (UniqueName: \"kubernetes.io/projected/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-kube-api-access-kt5r2\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618008 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-node-log\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618105 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-slash\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618157 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-kubelet\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618179 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-run-ovn-kubernetes\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618198 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-cni-netd\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618229 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-run-netns\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618249 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-ovn\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618457 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618480 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovnkube-script-lib\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618500 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-env-overrides\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618549 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-systemd\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618567 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovnkube-config\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618583 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618603 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-cni-bin\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618619 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-log-socket\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618662 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-systemd-units\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.618687 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovn-node-metrics-cert\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.719286 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d56904fe-1a5a-4fde-b122-947fd9a28130-ovn-node-metrics-cert\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.719369 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.719401 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-openvswitch\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.719461 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.719498 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-systemd\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.719947 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720701 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-config\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720736 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-etc-openvswitch\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720758 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-systemd-units\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720777 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-bin\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720795 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-ovn\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720822 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-kubelet\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720835 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-netd\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720849 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-netns\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720844 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720868 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-script-lib\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720887 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-env-overrides\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720920 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-ovn-kubernetes\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720946 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-var-lib-openvswitch\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720943 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.720988 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-log-socket\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721007 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-slash\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721006 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721043 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qs44\" (UniqueName: \"kubernetes.io/projected/d56904fe-1a5a-4fde-b122-947fd9a28130-kube-api-access-7qs44\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721046 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721061 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-node-log\") pod \"d56904fe-1a5a-4fde-b122-947fd9a28130\" (UID: \"d56904fe-1a5a-4fde-b122-947fd9a28130\") "
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721084 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721085 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721111 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721139 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721137 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721179 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-log-socket" (OuterVolumeSpecName: "log-socket") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721202 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-kubelet\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721217 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721226 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-run-ovn-kubernetes\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721256 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-run-ovn-kubernetes\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721293 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-slash" (OuterVolumeSpecName: "host-slash") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721293 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-cni-netd\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721359 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721372 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-run-netns\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721411 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-ovn\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721466 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721499 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovnkube-script-lib\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721521 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-run-netns\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721549 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-env-overrides\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721574 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-cni-netd\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721581 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721610 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-ovn\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721635 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-kubelet\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721650 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-systemd\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721677 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovnkube-config\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721699 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721722 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-cni-bin\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721737 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-log-socket\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721755 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-systemd-units\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721769 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovn-node-metrics-cert\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721802 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-etc-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721817 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-var-lib-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721844 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5r2\" (UniqueName: \"kubernetes.io/projected/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-kube-api-access-kt5r2\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721866 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-node-log\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721893 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-slash\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn"
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721963 5047 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721975 5047 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 23 06:59:29
crc kubenswrapper[5047]: I0223 06:59:29.721984 5047 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721993 5047 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722001 5047 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722010 5047 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722018 5047 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722027 5047 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722037 5047 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722045 5047 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722055 5047 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722063 5047 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-log-socket\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722071 5047 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-slash\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722080 5047 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722088 5047 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722113 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-slash\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721552 5047 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722112 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-env-overrides\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.721609 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-node-log" (OuterVolumeSpecName: "node-log") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722147 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-run-systemd\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722151 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-systemd-units\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722571 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovnkube-config\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722603 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722626 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-host-cni-bin\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc 
kubenswrapper[5047]: I0223 06:59:29.722650 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-log-socket\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722671 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-var-lib-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722694 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-etc-openvswitch\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722747 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovnkube-script-lib\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.722838 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-node-log\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.727648 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d56904fe-1a5a-4fde-b122-947fd9a28130-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.727875 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56904fe-1a5a-4fde-b122-947fd9a28130-kube-api-access-7qs44" (OuterVolumeSpecName: "kube-api-access-7qs44") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "kube-api-access-7qs44". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.728000 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-ovn-node-metrics-cert\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.741157 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5r2\" (UniqueName: \"kubernetes.io/projected/8ce27778-db6a-4a48-9e3b-d3b9c23ef610-kube-api-access-kt5r2\") pod \"ovnkube-node-4f2gn\" (UID: \"8ce27778-db6a-4a48-9e3b-d3b9c23ef610\") " pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.742848 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d56904fe-1a5a-4fde-b122-947fd9a28130" (UID: "d56904fe-1a5a-4fde-b122-947fd9a28130"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.822410 5047 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d56904fe-1a5a-4fde-b122-947fd9a28130-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.822447 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qs44\" (UniqueName: \"kubernetes.io/projected/d56904fe-1a5a-4fde-b122-947fd9a28130-kube-api-access-7qs44\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.822457 5047 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-node-log\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.822466 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d56904fe-1a5a-4fde-b122-947fd9a28130-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.822477 5047 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d56904fe-1a5a-4fde-b122-947fd9a28130-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[5047]: I0223 06:59:29.936547 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:29 crc kubenswrapper[5047]: W0223 06:59:29.958024 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce27778_db6a_4a48_9e3b_d3b9c23ef610.slice/crio-2068e1de3eb2854f2f12778223f41c99d53e1b76d2abe4dbbdb5318a048b16ac WatchSource:0}: Error finding container 2068e1de3eb2854f2f12778223f41c99d53e1b76d2abe4dbbdb5318a048b16ac: Status 404 returned error can't find the container with id 2068e1de3eb2854f2f12778223f41c99d53e1b76d2abe4dbbdb5318a048b16ac Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.544340 5047 generic.go:334] "Generic (PLEG): container finished" podID="8ce27778-db6a-4a48-9e3b-d3b9c23ef610" containerID="6f07599246c3a19cdf253e0d560a067c79d6f7994286c4858d24a956073a057c" exitCode=0 Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.544808 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerDied","Data":"6f07599246c3a19cdf253e0d560a067c79d6f7994286c4858d24a956073a057c"} Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.544853 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"2068e1de3eb2854f2f12778223f41c99d53e1b76d2abe4dbbdb5318a048b16ac"} Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.556630 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovn-acl-logging/0.log" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.557309 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rklm9_d56904fe-1a5a-4fde-b122-947fd9a28130/ovn-controller/0.log" Feb 23 06:59:30 crc 
kubenswrapper[5047]: I0223 06:59:30.557777 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" event={"ID":"d56904fe-1a5a-4fde-b122-947fd9a28130","Type":"ContainerDied","Data":"ea324bcdbb4441d059365008950e26825b4335a7f697e202136bcde5032be187"} Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.557832 5047 scope.go:117] "RemoveContainer" containerID="990884e87515f311025d6cd5914b187883806cd5a9b1797ec1cebb761603db6c" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.558049 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rklm9" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.563746 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n5dz9_e0fbd5e6-7dcc-4a13-936e-0db2e66394e8/kube-multus/2.log" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.563835 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n5dz9" event={"ID":"e0fbd5e6-7dcc-4a13-936e-0db2e66394e8","Type":"ContainerStarted","Data":"942b86b6a78424d3cb1d1987d7cc35e0bc51cb55810bb1935a04abe7a2f5a59a"} Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.596581 5047 scope.go:117] "RemoveContainer" containerID="7420ff3367aeb77d4d629dcd5c7ca0e444666f2f1d41cc599fefefead9349016" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.630668 5047 scope.go:117] "RemoveContainer" containerID="a54462bc0f6dcb8fe02a05fca0a14929e63f23a0f8bd94805aa550f4f2b12117" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.661617 5047 scope.go:117] "RemoveContainer" containerID="60892f1be5d186d6122c362810692f1442d592e71433006d2b16a6440baf4087" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.687966 5047 scope.go:117] "RemoveContainer" containerID="e76d096bd285623a6fa697153f05a09bf9367b96fc2859f70e77e4f49b9e1626" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.690302 5047 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rklm9"] Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.693925 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rklm9"] Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.706182 5047 scope.go:117] "RemoveContainer" containerID="68d052f44a5eb9a9fac44842cc095b2c0dc86854a47fdcae2fae3857eaef173e" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.718541 5047 scope.go:117] "RemoveContainer" containerID="d319ecb1a636f57dd8ed00ff1fc044b59087197a55869815153ba738567fc624" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.734149 5047 scope.go:117] "RemoveContainer" containerID="00d37a2dde2a97a231fefd77808ee27f1ff52045e9afcf0763a91b14d9a25eb2" Feb 23 06:59:30 crc kubenswrapper[5047]: I0223 06:59:30.761313 5047 scope.go:117] "RemoveContainer" containerID="b25772e94627fc64010897e81c05ad3a51691044ea91e42a61a02318d5c6b15f" Feb 23 06:59:31 crc kubenswrapper[5047]: I0223 06:59:31.575295 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"eb4516c44661fb4241d9dd60508256b05b0838816830c1dcd72d338e122e3c81"} Feb 23 06:59:31 crc kubenswrapper[5047]: I0223 06:59:31.575867 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"c376b63bd97cbf8be78565eb4a67903155d5b1f9b1d8e4535f575937182d5866"} Feb 23 06:59:31 crc kubenswrapper[5047]: I0223 06:59:31.575886 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"188aa00d27391c07bcefc900b935d2cb87942cd6e93554e7f4ca5b59bfd7122d"} Feb 23 06:59:31 crc kubenswrapper[5047]: I0223 06:59:31.576333 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"e903998bbdd1daa3aef25ff2504388439f13941caff98d47dfcb857b52cf1731"} Feb 23 06:59:31 crc kubenswrapper[5047]: I0223 06:59:31.576345 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"4cbb43f0fd9f109dae0c7959838111ea34f1f770784ce6c37897d779c875d6ac"} Feb 23 06:59:31 crc kubenswrapper[5047]: I0223 06:59:31.576357 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"e3e1354bd6339f97954d008d4f0e87c2a42ebaccf78966c003d0af0ccff47016"} Feb 23 06:59:32 crc kubenswrapper[5047]: I0223 06:59:32.361377 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56904fe-1a5a-4fde-b122-947fd9a28130" path="/var/lib/kubelet/pods/d56904fe-1a5a-4fde-b122-947fd9a28130/volumes" Feb 23 06:59:34 crc kubenswrapper[5047]: I0223 06:59:34.612213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"5b94e98acafe4a2a8a5d774efa9367009ef231fda6c91545256195ee278e2562"} Feb 23 06:59:36 crc kubenswrapper[5047]: I0223 06:59:36.631294 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" event={"ID":"8ce27778-db6a-4a48-9e3b-d3b9c23ef610","Type":"ContainerStarted","Data":"80b23c0a497c70d0cd14e8c8f5c56ccabad54355dc2d955dbece7535d5d252bb"} Feb 23 06:59:36 crc kubenswrapper[5047]: I0223 06:59:36.632923 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:36 crc 
kubenswrapper[5047]: I0223 06:59:36.632979 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:36 crc kubenswrapper[5047]: I0223 06:59:36.633036 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:36 crc kubenswrapper[5047]: I0223 06:59:36.677563 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:36 crc kubenswrapper[5047]: I0223 06:59:36.686591 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" podStartSLOduration=7.686567203 podStartE2EDuration="7.686567203s" podCreationTimestamp="2026-02-23 06:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:36.678312767 +0000 UTC m=+898.929639911" watchObservedRunningTime="2026-02-23 06:59:36.686567203 +0000 UTC m=+898.937894357" Feb 23 06:59:36 crc kubenswrapper[5047]: I0223 06:59:36.687188 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.541396 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-rgc85"] Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.549863 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.557434 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.557485 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.557982 5047 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-z68qx" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.566353 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.566721 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rgc85"] Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.585560 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b5e7015-387b-4bed-893c-0c80220d98e5-node-mnt\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.585624 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5d4n\" (UniqueName: \"kubernetes.io/projected/8b5e7015-387b-4bed-893c-0c80220d98e5-kube-api-access-w5d4n\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.585708 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b5e7015-387b-4bed-893c-0c80220d98e5-crc-storage\") pod \"crc-storage-crc-rgc85\" (UID: 
\"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.686735 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b5e7015-387b-4bed-893c-0c80220d98e5-node-mnt\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.686804 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5d4n\" (UniqueName: \"kubernetes.io/projected/8b5e7015-387b-4bed-893c-0c80220d98e5-kube-api-access-w5d4n\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.686848 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b5e7015-387b-4bed-893c-0c80220d98e5-crc-storage\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.687411 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b5e7015-387b-4bed-893c-0c80220d98e5-node-mnt\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.687874 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b5e7015-387b-4bed-893c-0c80220d98e5-crc-storage\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.719690 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5d4n\" (UniqueName: \"kubernetes.io/projected/8b5e7015-387b-4bed-893c-0c80220d98e5-kube-api-access-w5d4n\") pod \"crc-storage-crc-rgc85\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: I0223 06:59:39.886884 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: E0223 06:59:39.924737 5047 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rgc85_crc-storage_8b5e7015-387b-4bed-893c-0c80220d98e5_0(08149af823ba6b288d632526049dceb5c2850ea2f5887132af9d07b8cf088f1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:59:39 crc kubenswrapper[5047]: E0223 06:59:39.924833 5047 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rgc85_crc-storage_8b5e7015-387b-4bed-893c-0c80220d98e5_0(08149af823ba6b288d632526049dceb5c2850ea2f5887132af9d07b8cf088f1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: E0223 06:59:39.924859 5047 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rgc85_crc-storage_8b5e7015-387b-4bed-893c-0c80220d98e5_0(08149af823ba6b288d632526049dceb5c2850ea2f5887132af9d07b8cf088f1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:39 crc kubenswrapper[5047]: E0223 06:59:39.924943 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-rgc85_crc-storage(8b5e7015-387b-4bed-893c-0c80220d98e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-rgc85_crc-storage(8b5e7015-387b-4bed-893c-0c80220d98e5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-rgc85_crc-storage_8b5e7015-387b-4bed-893c-0c80220d98e5_0(08149af823ba6b288d632526049dceb5c2850ea2f5887132af9d07b8cf088f1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-rgc85" podUID="8b5e7015-387b-4bed-893c-0c80220d98e5" Feb 23 06:59:40 crc kubenswrapper[5047]: I0223 06:59:40.660357 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:40 crc kubenswrapper[5047]: I0223 06:59:40.661541 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:40 crc kubenswrapper[5047]: I0223 06:59:40.972369 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rgc85"] Feb 23 06:59:40 crc kubenswrapper[5047]: W0223 06:59:40.988372 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b5e7015_387b_4bed_893c_0c80220d98e5.slice/crio-0b6204337d9c2ea0e60cbca8be1edcc702fef56daa1d6a7c1539d1980139dcf4 WatchSource:0}: Error finding container 0b6204337d9c2ea0e60cbca8be1edcc702fef56daa1d6a7c1539d1980139dcf4: Status 404 returned error can't find the container with id 0b6204337d9c2ea0e60cbca8be1edcc702fef56daa1d6a7c1539d1980139dcf4 Feb 23 06:59:41 crc kubenswrapper[5047]: I0223 06:59:41.668315 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rgc85" event={"ID":"8b5e7015-387b-4bed-893c-0c80220d98e5","Type":"ContainerStarted","Data":"0b6204337d9c2ea0e60cbca8be1edcc702fef56daa1d6a7c1539d1980139dcf4"} Feb 23 06:59:42 crc kubenswrapper[5047]: I0223 06:59:42.679036 5047 generic.go:334] "Generic (PLEG): container finished" podID="8b5e7015-387b-4bed-893c-0c80220d98e5" containerID="bfb3b1b6c1bb52e7fa3129c3b4e49a5d31e93d027bc077159a879900e4e36f7e" exitCode=0 Feb 23 06:59:42 crc kubenswrapper[5047]: I0223 06:59:42.679132 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rgc85" event={"ID":"8b5e7015-387b-4bed-893c-0c80220d98e5","Type":"ContainerDied","Data":"bfb3b1b6c1bb52e7fa3129c3b4e49a5d31e93d027bc077159a879900e4e36f7e"} Feb 23 06:59:43 crc kubenswrapper[5047]: I0223 06:59:43.975395 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.157101 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b5e7015-387b-4bed-893c-0c80220d98e5-crc-storage\") pod \"8b5e7015-387b-4bed-893c-0c80220d98e5\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.157284 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b5e7015-387b-4bed-893c-0c80220d98e5-node-mnt\") pod \"8b5e7015-387b-4bed-893c-0c80220d98e5\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.157357 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5d4n\" (UniqueName: \"kubernetes.io/projected/8b5e7015-387b-4bed-893c-0c80220d98e5-kube-api-access-w5d4n\") pod \"8b5e7015-387b-4bed-893c-0c80220d98e5\" (UID: \"8b5e7015-387b-4bed-893c-0c80220d98e5\") " Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.162012 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b5e7015-387b-4bed-893c-0c80220d98e5-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8b5e7015-387b-4bed-893c-0c80220d98e5" (UID: "8b5e7015-387b-4bed-893c-0c80220d98e5"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.168830 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5e7015-387b-4bed-893c-0c80220d98e5-kube-api-access-w5d4n" (OuterVolumeSpecName: "kube-api-access-w5d4n") pod "8b5e7015-387b-4bed-893c-0c80220d98e5" (UID: "8b5e7015-387b-4bed-893c-0c80220d98e5"). InnerVolumeSpecName "kube-api-access-w5d4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.187966 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b5e7015-387b-4bed-893c-0c80220d98e5-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8b5e7015-387b-4bed-893c-0c80220d98e5" (UID: "8b5e7015-387b-4bed-893c-0c80220d98e5"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.258910 5047 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8b5e7015-387b-4bed-893c-0c80220d98e5-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.259012 5047 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8b5e7015-387b-4bed-893c-0c80220d98e5-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.259032 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5d4n\" (UniqueName: \"kubernetes.io/projected/8b5e7015-387b-4bed-893c-0c80220d98e5-kube-api-access-w5d4n\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.697058 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rgc85" event={"ID":"8b5e7015-387b-4bed-893c-0c80220d98e5","Type":"ContainerDied","Data":"0b6204337d9c2ea0e60cbca8be1edcc702fef56daa1d6a7c1539d1980139dcf4"} Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.697143 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6204337d9c2ea0e60cbca8be1edcc702fef56daa1d6a7c1539d1980139dcf4" Feb 23 06:59:44 crc kubenswrapper[5047]: I0223 06:59:44.697169 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rgc85" Feb 23 06:59:46 crc kubenswrapper[5047]: I0223 06:59:46.760288 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:59:46 crc kubenswrapper[5047]: I0223 06:59:46.761461 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:59:46 crc kubenswrapper[5047]: I0223 06:59:46.761668 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 06:59:46 crc kubenswrapper[5047]: I0223 06:59:46.763050 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf0d472a1be69e8faaec3de3ca52b467dc0e01107e59ea30330ccf2c4ba1980f"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:59:46 crc kubenswrapper[5047]: I0223 06:59:46.763181 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://bf0d472a1be69e8faaec3de3ca52b467dc0e01107e59ea30330ccf2c4ba1980f" gracePeriod=600 Feb 23 06:59:47 crc kubenswrapper[5047]: I0223 06:59:47.723140 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="bf0d472a1be69e8faaec3de3ca52b467dc0e01107e59ea30330ccf2c4ba1980f" exitCode=0 Feb 23 06:59:47 crc kubenswrapper[5047]: I0223 06:59:47.723272 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"bf0d472a1be69e8faaec3de3ca52b467dc0e01107e59ea30330ccf2c4ba1980f"} Feb 23 06:59:47 crc kubenswrapper[5047]: I0223 06:59:47.723713 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"3620b5bb5ccf16efd5aeafb9b69b3b4d8f6c25db9ca9c3c473e293ae127f043c"} Feb 23 06:59:47 crc kubenswrapper[5047]: I0223 06:59:47.723768 5047 scope.go:117] "RemoveContainer" containerID="8fef5335fab11be06cd5e79850ed7338a8964bb949c146ab6f73caf5aceecdbf" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.017827 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n"] Feb 23 06:59:51 crc kubenswrapper[5047]: E0223 06:59:51.018736 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5e7015-387b-4bed-893c-0c80220d98e5" containerName="storage" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.018753 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5e7015-387b-4bed-893c-0c80220d98e5" containerName="storage" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.018904 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5e7015-387b-4bed-893c-0c80220d98e5" containerName="storage" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.019745 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.021854 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.032634 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n"] Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.078660 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmpzq\" (UniqueName: \"kubernetes.io/projected/a40f73c1-7a41-4706-87fa-f5059a8adb4f-kube-api-access-rmpzq\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.078755 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.078800 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: 
I0223 06:59:51.181446 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.181509 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.181592 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmpzq\" (UniqueName: \"kubernetes.io/projected/a40f73c1-7a41-4706-87fa-f5059a8adb4f-kube-api-access-rmpzq\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.182132 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.182151 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.215082 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmpzq\" (UniqueName: \"kubernetes.io/projected/a40f73c1-7a41-4706-87fa-f5059a8adb4f-kube-api-access-rmpzq\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.385018 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.622875 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n"] Feb 23 06:59:51 crc kubenswrapper[5047]: W0223 06:59:51.629029 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda40f73c1_7a41_4706_87fa_f5059a8adb4f.slice/crio-69cc8fe0069b138e43e6e92214da2cb6908cb1d7ac88872fd8d42521f613b4e0 WatchSource:0}: Error finding container 69cc8fe0069b138e43e6e92214da2cb6908cb1d7ac88872fd8d42521f613b4e0: Status 404 returned error can't find the container with id 69cc8fe0069b138e43e6e92214da2cb6908cb1d7ac88872fd8d42521f613b4e0 Feb 23 06:59:51 crc kubenswrapper[5047]: I0223 06:59:51.765696 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" 
event={"ID":"a40f73c1-7a41-4706-87fa-f5059a8adb4f","Type":"ContainerStarted","Data":"69cc8fe0069b138e43e6e92214da2cb6908cb1d7ac88872fd8d42521f613b4e0"} Feb 23 06:59:52 crc kubenswrapper[5047]: I0223 06:59:52.777440 5047 generic.go:334] "Generic (PLEG): container finished" podID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerID="dc96948b978e9c6f168256f1419394b27965d02750c053d87a036cab1935c09d" exitCode=0 Feb 23 06:59:52 crc kubenswrapper[5047]: I0223 06:59:52.777580 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" event={"ID":"a40f73c1-7a41-4706-87fa-f5059a8adb4f","Type":"ContainerDied","Data":"dc96948b978e9c6f168256f1419394b27965d02750c053d87a036cab1935c09d"} Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.379440 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gd458"] Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.382038 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.393802 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gd458"] Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.516579 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-utilities\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.516706 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5bj\" (UniqueName: \"kubernetes.io/projected/afb0ee07-15b4-4671-a292-a36f1a055e7b-kube-api-access-cd5bj\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.516792 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-catalog-content\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.618310 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-catalog-content\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.618397 5047 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-utilities\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.618434 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5bj\" (UniqueName: \"kubernetes.io/projected/afb0ee07-15b4-4671-a292-a36f1a055e7b-kube-api-access-cd5bj\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.618940 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-catalog-content\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.618970 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-utilities\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.650313 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5bj\" (UniqueName: \"kubernetes.io/projected/afb0ee07-15b4-4671-a292-a36f1a055e7b-kube-api-access-cd5bj\") pod \"redhat-operators-gd458\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.719737 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 06:59:53 crc kubenswrapper[5047]: I0223 06:59:53.939693 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gd458"] Feb 23 06:59:53 crc kubenswrapper[5047]: W0223 06:59:53.952781 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb0ee07_15b4_4671_a292_a36f1a055e7b.slice/crio-556717faf42d5ead060f93f59053b8bc5108cb05d7edf9741e2b22e90244359e WatchSource:0}: Error finding container 556717faf42d5ead060f93f59053b8bc5108cb05d7edf9741e2b22e90244359e: Status 404 returned error can't find the container with id 556717faf42d5ead060f93f59053b8bc5108cb05d7edf9741e2b22e90244359e Feb 23 06:59:54 crc kubenswrapper[5047]: I0223 06:59:54.793235 5047 generic.go:334] "Generic (PLEG): container finished" podID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerID="38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be" exitCode=0 Feb 23 06:59:54 crc kubenswrapper[5047]: I0223 06:59:54.793403 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gd458" event={"ID":"afb0ee07-15b4-4671-a292-a36f1a055e7b","Type":"ContainerDied","Data":"38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be"} Feb 23 06:59:54 crc kubenswrapper[5047]: I0223 06:59:54.793854 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gd458" event={"ID":"afb0ee07-15b4-4671-a292-a36f1a055e7b","Type":"ContainerStarted","Data":"556717faf42d5ead060f93f59053b8bc5108cb05d7edf9741e2b22e90244359e"} Feb 23 06:59:54 crc kubenswrapper[5047]: I0223 06:59:54.796432 5047 generic.go:334] "Generic (PLEG): container finished" podID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerID="6691b1338432b18dc2677c39ff6a1b599c5359f1bdcc157625fef62968ed4997" exitCode=0 Feb 23 06:59:54 crc kubenswrapper[5047]: I0223 06:59:54.796496 
5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" event={"ID":"a40f73c1-7a41-4706-87fa-f5059a8adb4f","Type":"ContainerDied","Data":"6691b1338432b18dc2677c39ff6a1b599c5359f1bdcc157625fef62968ed4997"} Feb 23 06:59:55 crc kubenswrapper[5047]: I0223 06:59:55.808494 5047 generic.go:334] "Generic (PLEG): container finished" podID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerID="7f899017f0d891bf9e1907e122ef4d8e3534d47cf388db99c03e37dbe6f5235a" exitCode=0 Feb 23 06:59:55 crc kubenswrapper[5047]: I0223 06:59:55.808621 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" event={"ID":"a40f73c1-7a41-4706-87fa-f5059a8adb4f","Type":"ContainerDied","Data":"7f899017f0d891bf9e1907e122ef4d8e3534d47cf388db99c03e37dbe6f5235a"} Feb 23 06:59:55 crc kubenswrapper[5047]: I0223 06:59:55.813029 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gd458" event={"ID":"afb0ee07-15b4-4671-a292-a36f1a055e7b","Type":"ContainerStarted","Data":"bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4"} Feb 23 06:59:56 crc kubenswrapper[5047]: I0223 06:59:56.823846 5047 generic.go:334] "Generic (PLEG): container finished" podID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerID="bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4" exitCode=0 Feb 23 06:59:56 crc kubenswrapper[5047]: I0223 06:59:56.824008 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gd458" event={"ID":"afb0ee07-15b4-4671-a292-a36f1a055e7b","Type":"ContainerDied","Data":"bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4"} Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.236599 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.380379 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-util\") pod \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.380482 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmpzq\" (UniqueName: \"kubernetes.io/projected/a40f73c1-7a41-4706-87fa-f5059a8adb4f-kube-api-access-rmpzq\") pod \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.380591 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-bundle\") pod \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\" (UID: \"a40f73c1-7a41-4706-87fa-f5059a8adb4f\") " Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.381257 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-bundle" (OuterVolumeSpecName: "bundle") pod "a40f73c1-7a41-4706-87fa-f5059a8adb4f" (UID: "a40f73c1-7a41-4706-87fa-f5059a8adb4f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.388506 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40f73c1-7a41-4706-87fa-f5059a8adb4f-kube-api-access-rmpzq" (OuterVolumeSpecName: "kube-api-access-rmpzq") pod "a40f73c1-7a41-4706-87fa-f5059a8adb4f" (UID: "a40f73c1-7a41-4706-87fa-f5059a8adb4f"). InnerVolumeSpecName "kube-api-access-rmpzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.391229 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-util" (OuterVolumeSpecName: "util") pod "a40f73c1-7a41-4706-87fa-f5059a8adb4f" (UID: "a40f73c1-7a41-4706-87fa-f5059a8adb4f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.482513 5047 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-util\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.482574 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmpzq\" (UniqueName: \"kubernetes.io/projected/a40f73c1-7a41-4706-87fa-f5059a8adb4f-kube-api-access-rmpzq\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.482599 5047 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a40f73c1-7a41-4706-87fa-f5059a8adb4f-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.834751 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" event={"ID":"a40f73c1-7a41-4706-87fa-f5059a8adb4f","Type":"ContainerDied","Data":"69cc8fe0069b138e43e6e92214da2cb6908cb1d7ac88872fd8d42521f613b4e0"} Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.834821 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.834840 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69cc8fe0069b138e43e6e92214da2cb6908cb1d7ac88872fd8d42521f613b4e0" Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.838011 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gd458" event={"ID":"afb0ee07-15b4-4671-a292-a36f1a055e7b","Type":"ContainerStarted","Data":"9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231"} Feb 23 06:59:57 crc kubenswrapper[5047]: I0223 06:59:57.869313 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gd458" podStartSLOduration=2.384946976 podStartE2EDuration="4.869280714s" podCreationTimestamp="2026-02-23 06:59:53 +0000 UTC" firstStartedPulling="2026-02-23 06:59:54.795358974 +0000 UTC m=+917.046686108" lastFinishedPulling="2026-02-23 06:59:57.279692712 +0000 UTC m=+919.531019846" observedRunningTime="2026-02-23 06:59:57.865858289 +0000 UTC m=+920.117185443" watchObservedRunningTime="2026-02-23 06:59:57.869280714 +0000 UTC m=+920.120607858" Feb 23 06:59:59 crc kubenswrapper[5047]: I0223 06:59:59.978012 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4f2gn" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.191036 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9"] Feb 23 07:00:00 crc kubenswrapper[5047]: E0223 07:00:00.191366 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerName="util" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.191390 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerName="util" Feb 23 07:00:00 crc kubenswrapper[5047]: E0223 07:00:00.191402 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerName="pull" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.191411 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerName="pull" Feb 23 07:00:00 crc kubenswrapper[5047]: E0223 07:00:00.191425 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerName="extract" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.191432 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerName="extract" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.191555 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a40f73c1-7a41-4706-87fa-f5059a8adb4f" containerName="extract" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.192057 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.194571 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.194611 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.207479 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9"] Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.324402 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5340c59b-28d2-4263-9d55-44587a81ea28-config-volume\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.324476 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5340c59b-28d2-4263-9d55-44587a81ea28-secret-volume\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.324509 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfl82\" (UniqueName: \"kubernetes.io/projected/5340c59b-28d2-4263-9d55-44587a81ea28-kube-api-access-jfl82\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.425596 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5340c59b-28d2-4263-9d55-44587a81ea28-config-volume\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.425661 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5340c59b-28d2-4263-9d55-44587a81ea28-secret-volume\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.425693 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfl82\" (UniqueName: \"kubernetes.io/projected/5340c59b-28d2-4263-9d55-44587a81ea28-kube-api-access-jfl82\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.427447 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5340c59b-28d2-4263-9d55-44587a81ea28-config-volume\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.434222 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5340c59b-28d2-4263-9d55-44587a81ea28-secret-volume\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.451983 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfl82\" (UniqueName: \"kubernetes.io/projected/5340c59b-28d2-4263-9d55-44587a81ea28-kube-api-access-jfl82\") pod \"collect-profiles-29530500-hchc9\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.512926 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:00 crc kubenswrapper[5047]: I0223 07:00:00.932326 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9"] Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.280486 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-xrldn"] Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.283665 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.286568 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.286826 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-67wk9" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.287630 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.305033 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-xrldn"] Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.439936 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8sks\" (UniqueName: \"kubernetes.io/projected/82af618b-d2b3-4b63-8112-54073bbbea1e-kube-api-access-q8sks\") pod \"nmstate-operator-694c9596b7-xrldn\" (UID: \"82af618b-d2b3-4b63-8112-54073bbbea1e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.540954 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8sks\" (UniqueName: \"kubernetes.io/projected/82af618b-d2b3-4b63-8112-54073bbbea1e-kube-api-access-q8sks\") pod \"nmstate-operator-694c9596b7-xrldn\" (UID: \"82af618b-d2b3-4b63-8112-54073bbbea1e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.562279 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8sks\" (UniqueName: \"kubernetes.io/projected/82af618b-d2b3-4b63-8112-54073bbbea1e-kube-api-access-q8sks\") pod \"nmstate-operator-694c9596b7-xrldn\" (UID: 
\"82af618b-d2b3-4b63-8112-54073bbbea1e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.599247 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.852040 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-xrldn"] Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.866309 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" event={"ID":"5340c59b-28d2-4263-9d55-44587a81ea28","Type":"ContainerStarted","Data":"1891e4a1c2ab2142005656591d4d959173de2074304278f15288f5e43fb22087"} Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.866394 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" event={"ID":"5340c59b-28d2-4263-9d55-44587a81ea28","Type":"ContainerStarted","Data":"2f4e24163416ffa7366546ddeafec4db7b1b971569afdc04da76eb10fbd22b20"} Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.871619 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" event={"ID":"82af618b-d2b3-4b63-8112-54073bbbea1e","Type":"ContainerStarted","Data":"e9be6df71d630fdf2dec9f454d6898f0cb54a33907a9df1f54822575d3490c44"} Feb 23 07:00:01 crc kubenswrapper[5047]: I0223 07:00:01.888362 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" podStartSLOduration=1.8883236289999998 podStartE2EDuration="1.888323629s" podCreationTimestamp="2026-02-23 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:01.884542606 +0000 UTC 
m=+924.135869740" watchObservedRunningTime="2026-02-23 07:00:01.888323629 +0000 UTC m=+924.139650803" Feb 23 07:00:02 crc kubenswrapper[5047]: I0223 07:00:02.878753 5047 generic.go:334] "Generic (PLEG): container finished" podID="5340c59b-28d2-4263-9d55-44587a81ea28" containerID="1891e4a1c2ab2142005656591d4d959173de2074304278f15288f5e43fb22087" exitCode=0 Feb 23 07:00:02 crc kubenswrapper[5047]: I0223 07:00:02.878844 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" event={"ID":"5340c59b-28d2-4263-9d55-44587a81ea28","Type":"ContainerDied","Data":"1891e4a1c2ab2142005656591d4d959173de2074304278f15288f5e43fb22087"} Feb 23 07:00:03 crc kubenswrapper[5047]: I0223 07:00:03.720681 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 07:00:03 crc kubenswrapper[5047]: I0223 07:00:03.720735 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.205921 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.283839 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5340c59b-28d2-4263-9d55-44587a81ea28-secret-volume\") pod \"5340c59b-28d2-4263-9d55-44587a81ea28\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.284434 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfl82\" (UniqueName: \"kubernetes.io/projected/5340c59b-28d2-4263-9d55-44587a81ea28-kube-api-access-jfl82\") pod \"5340c59b-28d2-4263-9d55-44587a81ea28\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.284485 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5340c59b-28d2-4263-9d55-44587a81ea28-config-volume\") pod \"5340c59b-28d2-4263-9d55-44587a81ea28\" (UID: \"5340c59b-28d2-4263-9d55-44587a81ea28\") " Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.285608 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5340c59b-28d2-4263-9d55-44587a81ea28-config-volume" (OuterVolumeSpecName: "config-volume") pod "5340c59b-28d2-4263-9d55-44587a81ea28" (UID: "5340c59b-28d2-4263-9d55-44587a81ea28"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.290588 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5340c59b-28d2-4263-9d55-44587a81ea28-kube-api-access-jfl82" (OuterVolumeSpecName: "kube-api-access-jfl82") pod "5340c59b-28d2-4263-9d55-44587a81ea28" (UID: "5340c59b-28d2-4263-9d55-44587a81ea28"). 
InnerVolumeSpecName "kube-api-access-jfl82". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.306271 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5340c59b-28d2-4263-9d55-44587a81ea28-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5340c59b-28d2-4263-9d55-44587a81ea28" (UID: "5340c59b-28d2-4263-9d55-44587a81ea28"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.385838 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfl82\" (UniqueName: \"kubernetes.io/projected/5340c59b-28d2-4263-9d55-44587a81ea28-kube-api-access-jfl82\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.385886 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5340c59b-28d2-4263-9d55-44587a81ea28-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.385925 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5340c59b-28d2-4263-9d55-44587a81ea28-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.759135 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gd458" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="registry-server" probeResult="failure" output=< Feb 23 07:00:04 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 07:00:04 crc kubenswrapper[5047]: > Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.917763 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" 
event={"ID":"5340c59b-28d2-4263-9d55-44587a81ea28","Type":"ContainerDied","Data":"2f4e24163416ffa7366546ddeafec4db7b1b971569afdc04da76eb10fbd22b20"} Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.917836 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4e24163416ffa7366546ddeafec4db7b1b971569afdc04da76eb10fbd22b20" Feb 23 07:00:04 crc kubenswrapper[5047]: I0223 07:00:04.917886 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9" Feb 23 07:00:13 crc kubenswrapper[5047]: I0223 07:00:13.764118 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 07:00:13 crc kubenswrapper[5047]: I0223 07:00:13.835173 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 07:00:13 crc kubenswrapper[5047]: I0223 07:00:13.999724 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gd458"] Feb 23 07:00:14 crc kubenswrapper[5047]: I0223 07:00:14.996883 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gd458" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="registry-server" containerID="cri-o://9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231" gracePeriod=2 Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.418749 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.581249 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-utilities\") pod \"afb0ee07-15b4-4671-a292-a36f1a055e7b\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.581330 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd5bj\" (UniqueName: \"kubernetes.io/projected/afb0ee07-15b4-4671-a292-a36f1a055e7b-kube-api-access-cd5bj\") pod \"afb0ee07-15b4-4671-a292-a36f1a055e7b\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.581531 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-catalog-content\") pod \"afb0ee07-15b4-4671-a292-a36f1a055e7b\" (UID: \"afb0ee07-15b4-4671-a292-a36f1a055e7b\") " Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.582682 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-utilities" (OuterVolumeSpecName: "utilities") pod "afb0ee07-15b4-4671-a292-a36f1a055e7b" (UID: "afb0ee07-15b4-4671-a292-a36f1a055e7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.594960 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb0ee07-15b4-4671-a292-a36f1a055e7b-kube-api-access-cd5bj" (OuterVolumeSpecName: "kube-api-access-cd5bj") pod "afb0ee07-15b4-4671-a292-a36f1a055e7b" (UID: "afb0ee07-15b4-4671-a292-a36f1a055e7b"). InnerVolumeSpecName "kube-api-access-cd5bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.683486 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.683535 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd5bj\" (UniqueName: \"kubernetes.io/projected/afb0ee07-15b4-4671-a292-a36f1a055e7b-kube-api-access-cd5bj\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.732791 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afb0ee07-15b4-4671-a292-a36f1a055e7b" (UID: "afb0ee07-15b4-4671-a292-a36f1a055e7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[5047]: I0223 07:00:15.785874 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb0ee07-15b4-4671-a292-a36f1a055e7b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.007243 5047 generic.go:334] "Generic (PLEG): container finished" podID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerID="9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231" exitCode=0 Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.007355 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gd458" event={"ID":"afb0ee07-15b4-4671-a292-a36f1a055e7b","Type":"ContainerDied","Data":"9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231"} Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.007401 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-gd458" event={"ID":"afb0ee07-15b4-4671-a292-a36f1a055e7b","Type":"ContainerDied","Data":"556717faf42d5ead060f93f59053b8bc5108cb05d7edf9741e2b22e90244359e"} Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.007432 5047 scope.go:117] "RemoveContainer" containerID="9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.007603 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gd458" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.043664 5047 scope.go:117] "RemoveContainer" containerID="bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.073044 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gd458"] Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.078078 5047 scope.go:117] "RemoveContainer" containerID="38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.080054 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gd458"] Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.098757 5047 scope.go:117] "RemoveContainer" containerID="9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231" Feb 23 07:00:16 crc kubenswrapper[5047]: E0223 07:00:16.099405 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231\": container with ID starting with 9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231 not found: ID does not exist" containerID="9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.099459 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231"} err="failed to get container status \"9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231\": rpc error: code = NotFound desc = could not find container \"9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231\": container with ID starting with 9bb48aef934b0989213a35ee1db46731fa95fe73efb1583255a33c46798e0231 not found: ID does not exist" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.099492 5047 scope.go:117] "RemoveContainer" containerID="bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4" Feb 23 07:00:16 crc kubenswrapper[5047]: E0223 07:00:16.099843 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4\": container with ID starting with bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4 not found: ID does not exist" containerID="bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.099921 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4"} err="failed to get container status \"bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4\": rpc error: code = NotFound desc = could not find container \"bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4\": container with ID starting with bc0001ee5edfa3b0a3920f3bb3cc1d4c1711628f0f8fccc986d107360f24b3f4 not found: ID does not exist" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.099962 5047 scope.go:117] "RemoveContainer" containerID="38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be" Feb 23 07:00:16 crc kubenswrapper[5047]: E0223 
07:00:16.100725 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be\": container with ID starting with 38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be not found: ID does not exist" containerID="38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.100756 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be"} err="failed to get container status \"38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be\": rpc error: code = NotFound desc = could not find container \"38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be\": container with ID starting with 38dd5f05e35c0792ef3ecfdb69fa69f3f56ef3dc135ed0b289b5f368b997d4be not found: ID does not exist" Feb 23 07:00:16 crc kubenswrapper[5047]: I0223 07:00:16.351480 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" path="/var/lib/kubelet/pods/afb0ee07-15b4-4671-a292-a36f1a055e7b/volumes" Feb 23 07:00:22 crc kubenswrapper[5047]: I0223 07:00:22.057937 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" event={"ID":"82af618b-d2b3-4b63-8112-54073bbbea1e","Type":"ContainerStarted","Data":"058d8547178aba9dea145f861272f39bbc80a1cb4463097252e268e647aacf46"} Feb 23 07:00:22 crc kubenswrapper[5047]: I0223 07:00:22.082160 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-xrldn" podStartSLOduration=1.30826242 podStartE2EDuration="21.082129562s" podCreationTimestamp="2026-02-23 07:00:01 +0000 UTC" firstStartedPulling="2026-02-23 07:00:01.860696902 +0000 UTC m=+924.112024076" 
lastFinishedPulling="2026-02-23 07:00:21.634564084 +0000 UTC m=+943.885891218" observedRunningTime="2026-02-23 07:00:22.078825082 +0000 UTC m=+944.330152246" watchObservedRunningTime="2026-02-23 07:00:22.082129562 +0000 UTC m=+944.333456726" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.105264 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq"] Feb 23 07:00:23 crc kubenswrapper[5047]: E0223 07:00:23.106149 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="extract-utilities" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.106174 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="extract-utilities" Feb 23 07:00:23 crc kubenswrapper[5047]: E0223 07:00:23.106207 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="extract-content" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.106219 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="extract-content" Feb 23 07:00:23 crc kubenswrapper[5047]: E0223 07:00:23.106236 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5340c59b-28d2-4263-9d55-44587a81ea28" containerName="collect-profiles" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.106249 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5340c59b-28d2-4263-9d55-44587a81ea28" containerName="collect-profiles" Feb 23 07:00:23 crc kubenswrapper[5047]: E0223 07:00:23.106263 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="registry-server" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.106275 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" 
containerName="registry-server" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.106448 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb0ee07-15b4-4671-a292-a36f1a055e7b" containerName="registry-server" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.106481 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5340c59b-28d2-4263-9d55-44587a81ea28" containerName="collect-profiles" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.107501 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.110735 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z6j8m" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.113094 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.114320 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.118296 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.131715 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.139809 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.152213 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kwhjc"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.153187 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.205866 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9g89\" (UniqueName: \"kubernetes.io/projected/56627564-fec8-4817-9883-639384c8c1ed-kube-api-access-n9g89\") pod \"nmstate-webhook-866bcb46dc-wqj42\" (UID: \"56627564-fec8-4817-9883-639384c8c1ed\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.206150 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvhp\" (UniqueName: \"kubernetes.io/projected/c7816fce-0cd4-4edb-a9df-3589458e007a-kube-api-access-dcvhp\") pod \"nmstate-metrics-58c85c668d-mwjdq\" (UID: \"c7816fce-0cd4-4edb-a9df-3589458e007a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.206206 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/56627564-fec8-4817-9883-639384c8c1ed-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-wqj42\" (UID: \"56627564-fec8-4817-9883-639384c8c1ed\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.263989 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.264735 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.269633 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.269642 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.270431 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-sjgvt" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.281180 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.308304 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/56627564-fec8-4817-9883-639384c8c1ed-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-wqj42\" (UID: \"56627564-fec8-4817-9883-639384c8c1ed\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.308371 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-dbus-socket\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.308421 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-nmstate-lock\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc 
kubenswrapper[5047]: I0223 07:00:23.308472 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgv7q\" (UniqueName: \"kubernetes.io/projected/48daf116-bd6d-462a-baf4-6dfb51456617-kube-api-access-mgv7q\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.308497 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-ovs-socket\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: E0223 07:00:23.308669 5047 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.308744 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9g89\" (UniqueName: \"kubernetes.io/projected/56627564-fec8-4817-9883-639384c8c1ed-kube-api-access-n9g89\") pod \"nmstate-webhook-866bcb46dc-wqj42\" (UID: \"56627564-fec8-4817-9883-639384c8c1ed\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: E0223 07:00:23.309070 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56627564-fec8-4817-9883-639384c8c1ed-tls-key-pair podName:56627564-fec8-4817-9883-639384c8c1ed nodeName:}" failed. No retries permitted until 2026-02-23 07:00:23.809023853 +0000 UTC m=+946.060351147 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/56627564-fec8-4817-9883-639384c8c1ed-tls-key-pair") pod "nmstate-webhook-866bcb46dc-wqj42" (UID: "56627564-fec8-4817-9883-639384c8c1ed") : secret "openshift-nmstate-webhook" not found Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.309107 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvhp\" (UniqueName: \"kubernetes.io/projected/c7816fce-0cd4-4edb-a9df-3589458e007a-kube-api-access-dcvhp\") pod \"nmstate-metrics-58c85c668d-mwjdq\" (UID: \"c7816fce-0cd4-4edb-a9df-3589458e007a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.329707 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9g89\" (UniqueName: \"kubernetes.io/projected/56627564-fec8-4817-9883-639384c8c1ed-kube-api-access-n9g89\") pod \"nmstate-webhook-866bcb46dc-wqj42\" (UID: \"56627564-fec8-4817-9883-639384c8c1ed\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.331853 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvhp\" (UniqueName: \"kubernetes.io/projected/c7816fce-0cd4-4edb-a9df-3589458e007a-kube-api-access-dcvhp\") pod \"nmstate-metrics-58c85c668d-mwjdq\" (UID: \"c7816fce-0cd4-4edb-a9df-3589458e007a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.409937 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-dbus-socket\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.409993 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4a3dbc97-61d9-4247-b652-f136cfc02688-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.410031 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-nmstate-lock\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.410049 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgv7q\" (UniqueName: \"kubernetes.io/projected/48daf116-bd6d-462a-baf4-6dfb51456617-kube-api-access-mgv7q\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.410072 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-ovs-socket\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.410110 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxf2h\" (UniqueName: \"kubernetes.io/projected/4a3dbc97-61d9-4247-b652-f136cfc02688-kube-api-access-rxf2h\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 
07:00:23.410137 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3dbc97-61d9-4247-b652-f136cfc02688-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.410228 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-nmstate-lock\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.410332 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-ovs-socket\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.410431 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/48daf116-bd6d-462a-baf4-6dfb51456617-dbus-socket\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.430029 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgv7q\" (UniqueName: \"kubernetes.io/projected/48daf116-bd6d-462a-baf4-6dfb51456617-kube-api-access-mgv7q\") pod \"nmstate-handler-kwhjc\" (UID: \"48daf116-bd6d-462a-baf4-6dfb51456617\") " pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.431947 5047 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.470803 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.512720 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxf2h\" (UniqueName: \"kubernetes.io/projected/4a3dbc97-61d9-4247-b652-f136cfc02688-kube-api-access-rxf2h\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.513224 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3dbc97-61d9-4247-b652-f136cfc02688-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.513296 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4a3dbc97-61d9-4247-b652-f136cfc02688-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.513112 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-75bc6c8444-266zl"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.514494 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.514961 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4a3dbc97-61d9-4247-b652-f136cfc02688-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.520814 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3dbc97-61d9-4247-b652-f136cfc02688-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.533131 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75bc6c8444-266zl"] Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.539530 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxf2h\" (UniqueName: \"kubernetes.io/projected/4a3dbc97-61d9-4247-b652-f136cfc02688-kube-api-access-rxf2h\") pod \"nmstate-console-plugin-5c78fc5d65-jrmbs\" (UID: \"4a3dbc97-61d9-4247-b652-f136cfc02688\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.579705 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.616708 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-oauth-serving-cert\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.616769 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-serving-cert\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.616793 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-config\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.616916 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-service-ca\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.616944 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s575r\" (UniqueName: 
\"kubernetes.io/projected/06d8380a-7cbc-44cf-986a-4597f832dfb9-kube-api-access-s575r\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.616964 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-trusted-ca-bundle\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.616986 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-oauth-config\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.718480 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-serving-cert\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.718544 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-config\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.718617 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-service-ca\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.718654 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s575r\" (UniqueName: \"kubernetes.io/projected/06d8380a-7cbc-44cf-986a-4597f832dfb9-kube-api-access-s575r\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.718685 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-trusted-ca-bundle\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.718722 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-oauth-config\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.718791 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-oauth-serving-cert\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.719959 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-service-ca\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.720246 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-oauth-serving-cert\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.722278 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-config\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.725218 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06d8380a-7cbc-44cf-986a-4597f832dfb9-trusted-ca-bundle\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.727137 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-serving-cert\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.732619 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/06d8380a-7cbc-44cf-986a-4597f832dfb9-console-oauth-config\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.748267 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s575r\" (UniqueName: \"kubernetes.io/projected/06d8380a-7cbc-44cf-986a-4597f832dfb9-kube-api-access-s575r\") pod \"console-75bc6c8444-266zl\" (UID: \"06d8380a-7cbc-44cf-986a-4597f832dfb9\") " pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.820472 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/56627564-fec8-4817-9883-639384c8c1ed-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-wqj42\" (UID: \"56627564-fec8-4817-9883-639384c8c1ed\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.830814 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/56627564-fec8-4817-9883-639384c8c1ed-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-wqj42\" (UID: \"56627564-fec8-4817-9883-639384c8c1ed\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.839216 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:23 crc kubenswrapper[5047]: W0223 07:00:23.853072 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a3dbc97_61d9_4247_b652_f136cfc02688.slice/crio-fadca0399dde9dcee588e52e559da18fb0d6a3041eb7fe1d726456532356091d WatchSource:0}: Error finding container fadca0399dde9dcee588e52e559da18fb0d6a3041eb7fe1d726456532356091d: Status 404 returned error can't find the container with id fadca0399dde9dcee588e52e559da18fb0d6a3041eb7fe1d726456532356091d Feb 23 07:00:23 crc kubenswrapper[5047]: I0223 07:00:23.854007 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs"] Feb 23 07:00:24 crc kubenswrapper[5047]: I0223 07:00:24.007301 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq"] Feb 23 07:00:24 crc kubenswrapper[5047]: W0223 07:00:24.019170 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7816fce_0cd4_4edb_a9df_3589458e007a.slice/crio-17ae281a7f08b5ba932bac0d6b28117d7d9cb8d43b95f30cdf136d82dbc7139e WatchSource:0}: Error finding container 17ae281a7f08b5ba932bac0d6b28117d7d9cb8d43b95f30cdf136d82dbc7139e: Status 404 returned error can't find the container with id 17ae281a7f08b5ba932bac0d6b28117d7d9cb8d43b95f30cdf136d82dbc7139e Feb 23 07:00:24 crc kubenswrapper[5047]: I0223 07:00:24.051204 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:24 crc kubenswrapper[5047]: I0223 07:00:24.080175 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" event={"ID":"c7816fce-0cd4-4edb-a9df-3589458e007a","Type":"ContainerStarted","Data":"17ae281a7f08b5ba932bac0d6b28117d7d9cb8d43b95f30cdf136d82dbc7139e"} Feb 23 07:00:24 crc kubenswrapper[5047]: I0223 07:00:24.082595 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" event={"ID":"4a3dbc97-61d9-4247-b652-f136cfc02688","Type":"ContainerStarted","Data":"fadca0399dde9dcee588e52e559da18fb0d6a3041eb7fe1d726456532356091d"} Feb 23 07:00:24 crc kubenswrapper[5047]: I0223 07:00:24.084232 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kwhjc" event={"ID":"48daf116-bd6d-462a-baf4-6dfb51456617","Type":"ContainerStarted","Data":"66aa72c2df71284c7315a278d2246a902234af4a15084cfcd74e1df388b649eb"} Feb 23 07:00:24 crc kubenswrapper[5047]: I0223 07:00:24.118382 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75bc6c8444-266zl"] Feb 23 07:00:24 crc kubenswrapper[5047]: I0223 07:00:24.296864 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42"] Feb 23 07:00:24 crc kubenswrapper[5047]: W0223 07:00:24.309576 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56627564_fec8_4817_9883_639384c8c1ed.slice/crio-879af5a4111f187515d5970c404e60679f92eeef0fff612ffaa989284a218545 WatchSource:0}: Error finding container 879af5a4111f187515d5970c404e60679f92eeef0fff612ffaa989284a218545: Status 404 returned error can't find the container with id 879af5a4111f187515d5970c404e60679f92eeef0fff612ffaa989284a218545 Feb 23 07:00:25 crc kubenswrapper[5047]: I0223 
07:00:25.094599 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bc6c8444-266zl" event={"ID":"06d8380a-7cbc-44cf-986a-4597f832dfb9","Type":"ContainerStarted","Data":"21e02f796fd802d0e1bac7e29adf67fc574c3ef681236f20d81a5dea902eeba5"} Feb 23 07:00:25 crc kubenswrapper[5047]: I0223 07:00:25.095029 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75bc6c8444-266zl" event={"ID":"06d8380a-7cbc-44cf-986a-4597f832dfb9","Type":"ContainerStarted","Data":"48cac5b9897aaf762bfbbb23805b9d9390688c71b224a4e7166d60b5f97f854e"} Feb 23 07:00:25 crc kubenswrapper[5047]: I0223 07:00:25.096075 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" event={"ID":"56627564-fec8-4817-9883-639384c8c1ed","Type":"ContainerStarted","Data":"879af5a4111f187515d5970c404e60679f92eeef0fff612ffaa989284a218545"} Feb 23 07:00:25 crc kubenswrapper[5047]: I0223 07:00:25.120931 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75bc6c8444-266zl" podStartSLOduration=2.120893547 podStartE2EDuration="2.120893547s" podCreationTimestamp="2026-02-23 07:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:25.118734789 +0000 UTC m=+947.370061913" watchObservedRunningTime="2026-02-23 07:00:25.120893547 +0000 UTC m=+947.372220681" Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.111363 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" event={"ID":"4a3dbc97-61d9-4247-b652-f136cfc02688","Type":"ContainerStarted","Data":"3da0d683c591d96492bbf0d7e53fe09d6440d0fc3bed44f7c20c8682c8923502"} Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.113204 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kwhjc" 
event={"ID":"48daf116-bd6d-462a-baf4-6dfb51456617","Type":"ContainerStarted","Data":"1f01ff95cdb94b79ae307bb8c889fe35a5aaefde3c685e78a09e2a204b631bc3"} Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.113409 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.115113 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" event={"ID":"56627564-fec8-4817-9883-639384c8c1ed","Type":"ContainerStarted","Data":"3e4b23689f52d007288ae77e3cbbc077d8e4eff067e0c6ba15aa2560dbfa4eab"} Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.115952 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.117972 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" event={"ID":"c7816fce-0cd4-4edb-a9df-3589458e007a","Type":"ContainerStarted","Data":"c603dc99f28a7dc8910448eabdaa2ac4cd95d55a93352ff8d29e8df72b024ab4"} Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.141148 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-jrmbs" podStartSLOduration=1.616730738 podStartE2EDuration="4.141121794s" podCreationTimestamp="2026-02-23 07:00:23 +0000 UTC" firstStartedPulling="2026-02-23 07:00:23.857785195 +0000 UTC m=+946.109112329" lastFinishedPulling="2026-02-23 07:00:26.382176211 +0000 UTC m=+948.633503385" observedRunningTime="2026-02-23 07:00:27.140193948 +0000 UTC m=+949.391521102" watchObservedRunningTime="2026-02-23 07:00:27.141121794 +0000 UTC m=+949.392448938" Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.172055 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" podStartSLOduration=2.101079654 podStartE2EDuration="4.172026451s" podCreationTimestamp="2026-02-23 07:00:23 +0000 UTC" firstStartedPulling="2026-02-23 07:00:24.313001442 +0000 UTC m=+946.564328576" lastFinishedPulling="2026-02-23 07:00:26.383948239 +0000 UTC m=+948.635275373" observedRunningTime="2026-02-23 07:00:27.171325721 +0000 UTC m=+949.422652875" watchObservedRunningTime="2026-02-23 07:00:27.172026451 +0000 UTC m=+949.423353595" Feb 23 07:00:27 crc kubenswrapper[5047]: I0223 07:00:27.197337 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kwhjc" podStartSLOduration=1.309229249 podStartE2EDuration="4.197310664s" podCreationTimestamp="2026-02-23 07:00:23 +0000 UTC" firstStartedPulling="2026-02-23 07:00:23.500017248 +0000 UTC m=+945.751344382" lastFinishedPulling="2026-02-23 07:00:26.388098663 +0000 UTC m=+948.639425797" observedRunningTime="2026-02-23 07:00:27.188940275 +0000 UTC m=+949.440267419" watchObservedRunningTime="2026-02-23 07:00:27.197310664 +0000 UTC m=+949.448637788" Feb 23 07:00:29 crc kubenswrapper[5047]: I0223 07:00:29.137135 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" event={"ID":"c7816fce-0cd4-4edb-a9df-3589458e007a","Type":"ContainerStarted","Data":"c98b3240baaa00127c831b4515e6d79afb4e6195d0e3798cbcbb1483848ee4b3"} Feb 23 07:00:29 crc kubenswrapper[5047]: I0223 07:00:29.166721 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwjdq" podStartSLOduration=1.673052802 podStartE2EDuration="6.166683337s" podCreationTimestamp="2026-02-23 07:00:23 +0000 UTC" firstStartedPulling="2026-02-23 07:00:24.021992265 +0000 UTC m=+946.273319399" lastFinishedPulling="2026-02-23 07:00:28.5156228 +0000 UTC m=+950.766949934" observedRunningTime="2026-02-23 07:00:29.163240262 +0000 UTC m=+951.414567446" 
watchObservedRunningTime="2026-02-23 07:00:29.166683337 +0000 UTC m=+951.418010551" Feb 23 07:00:33 crc kubenswrapper[5047]: I0223 07:00:33.507633 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kwhjc" Feb 23 07:00:33 crc kubenswrapper[5047]: I0223 07:00:33.840029 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:33 crc kubenswrapper[5047]: I0223 07:00:33.840117 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:33 crc kubenswrapper[5047]: I0223 07:00:33.845860 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:34 crc kubenswrapper[5047]: I0223 07:00:34.179401 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75bc6c8444-266zl" Feb 23 07:00:34 crc kubenswrapper[5047]: I0223 07:00:34.245786 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v5zcl"] Feb 23 07:00:44 crc kubenswrapper[5047]: I0223 07:00:44.061227 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-wqj42" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.100723 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps"] Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.102769 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.105392 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.114932 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps"] Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.153003 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.153384 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trghr\" (UniqueName: \"kubernetes.io/projected/e914782d-7780-4bd6-947e-7d381f18c574-kube-api-access-trghr\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.153494 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: 
I0223 07:00:59.255610 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.255883 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.256042 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trghr\" (UniqueName: \"kubernetes.io/projected/e914782d-7780-4bd6-947e-7d381f18c574-kube-api-access-trghr\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.256811 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.256960 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.294417 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trghr\" (UniqueName: \"kubernetes.io/projected/e914782d-7780-4bd6-947e-7d381f18c574-kube-api-access-trghr\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.302312 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-v5zcl" podUID="cd20eb8d-5a96-407f-a898-1dad49ba8355" containerName="console" containerID="cri-o://3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12" gracePeriod=15 Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.433835 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.710483 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps"] Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.746072 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v5zcl_cd20eb8d-5a96-407f-a898-1dad49ba8355/console/0.log" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.746167 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.865179 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7wds\" (UniqueName: \"kubernetes.io/projected/cd20eb8d-5a96-407f-a898-1dad49ba8355-kube-api-access-j7wds\") pod \"cd20eb8d-5a96-407f-a898-1dad49ba8355\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.865297 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-trusted-ca-bundle\") pod \"cd20eb8d-5a96-407f-a898-1dad49ba8355\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.865418 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-oauth-serving-cert\") pod \"cd20eb8d-5a96-407f-a898-1dad49ba8355\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.865447 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-service-ca\") pod \"cd20eb8d-5a96-407f-a898-1dad49ba8355\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.865514 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-config\") pod \"cd20eb8d-5a96-407f-a898-1dad49ba8355\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.865586 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-serving-cert\") pod \"cd20eb8d-5a96-407f-a898-1dad49ba8355\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.865625 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-oauth-config\") pod \"cd20eb8d-5a96-407f-a898-1dad49ba8355\" (UID: \"cd20eb8d-5a96-407f-a898-1dad49ba8355\") " Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.866325 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-service-ca" (OuterVolumeSpecName: "service-ca") pod "cd20eb8d-5a96-407f-a898-1dad49ba8355" (UID: "cd20eb8d-5a96-407f-a898-1dad49ba8355"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.866420 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cd20eb8d-5a96-407f-a898-1dad49ba8355" (UID: "cd20eb8d-5a96-407f-a898-1dad49ba8355"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.866560 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-config" (OuterVolumeSpecName: "console-config") pod "cd20eb8d-5a96-407f-a898-1dad49ba8355" (UID: "cd20eb8d-5a96-407f-a898-1dad49ba8355"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.866602 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cd20eb8d-5a96-407f-a898-1dad49ba8355" (UID: "cd20eb8d-5a96-407f-a898-1dad49ba8355"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.873409 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cd20eb8d-5a96-407f-a898-1dad49ba8355" (UID: "cd20eb8d-5a96-407f-a898-1dad49ba8355"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.874204 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cd20eb8d-5a96-407f-a898-1dad49ba8355" (UID: "cd20eb8d-5a96-407f-a898-1dad49ba8355"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.875239 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd20eb8d-5a96-407f-a898-1dad49ba8355-kube-api-access-j7wds" (OuterVolumeSpecName: "kube-api-access-j7wds") pod "cd20eb8d-5a96-407f-a898-1dad49ba8355" (UID: "cd20eb8d-5a96-407f-a898-1dad49ba8355"). InnerVolumeSpecName "kube-api-access-j7wds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.966936 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7wds\" (UniqueName: \"kubernetes.io/projected/cd20eb8d-5a96-407f-a898-1dad49ba8355-kube-api-access-j7wds\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.966967 5047 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.966980 5047 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.966991 5047 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.967002 5047 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.967010 5047 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:59 crc kubenswrapper[5047]: I0223 07:00:59.967018 5047 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd20eb8d-5a96-407f-a898-1dad49ba8355-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:00 crc 
kubenswrapper[5047]: I0223 07:01:00.393078 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v5zcl_cd20eb8d-5a96-407f-a898-1dad49ba8355/console/0.log" Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.393691 5047 generic.go:334] "Generic (PLEG): container finished" podID="cd20eb8d-5a96-407f-a898-1dad49ba8355" containerID="3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12" exitCode=2 Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.393764 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v5zcl" Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.393812 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5zcl" event={"ID":"cd20eb8d-5a96-407f-a898-1dad49ba8355","Type":"ContainerDied","Data":"3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12"} Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.393866 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5zcl" event={"ID":"cd20eb8d-5a96-407f-a898-1dad49ba8355","Type":"ContainerDied","Data":"b2e733971ff89866db652f10e7fe441bf66de1b155e2ad4e384599063be9625c"} Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.393898 5047 scope.go:117] "RemoveContainer" containerID="3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12" Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.401555 5047 generic.go:334] "Generic (PLEG): container finished" podID="e914782d-7780-4bd6-947e-7d381f18c574" containerID="8df5ba008ee9a4e6ffe4a1184887f182854152243b29012098549694d167e1f5" exitCode=0 Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.401602 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" 
event={"ID":"e914782d-7780-4bd6-947e-7d381f18c574","Type":"ContainerDied","Data":"8df5ba008ee9a4e6ffe4a1184887f182854152243b29012098549694d167e1f5"} Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.401631 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" event={"ID":"e914782d-7780-4bd6-947e-7d381f18c574","Type":"ContainerStarted","Data":"24b5daeaccbcae11a8631804ac47e8927cf2dadc5d0417738575977f2b13a96b"} Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.424974 5047 scope.go:117] "RemoveContainer" containerID="3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12" Feb 23 07:01:00 crc kubenswrapper[5047]: E0223 07:01:00.425606 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12\": container with ID starting with 3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12 not found: ID does not exist" containerID="3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12" Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.425674 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12"} err="failed to get container status \"3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12\": rpc error: code = NotFound desc = could not find container \"3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12\": container with ID starting with 3d8ef04e55486d0d69c506579d29f5ba19fc94ad45ea8f63f5378e06e54b6d12 not found: ID does not exist" Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.427267 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v5zcl"] Feb 23 07:01:00 crc kubenswrapper[5047]: I0223 07:01:00.432227 5047 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-v5zcl"] Feb 23 07:01:02 crc kubenswrapper[5047]: I0223 07:01:02.353661 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd20eb8d-5a96-407f-a898-1dad49ba8355" path="/var/lib/kubelet/pods/cd20eb8d-5a96-407f-a898-1dad49ba8355/volumes" Feb 23 07:01:02 crc kubenswrapper[5047]: I0223 07:01:02.422220 5047 generic.go:334] "Generic (PLEG): container finished" podID="e914782d-7780-4bd6-947e-7d381f18c574" containerID="1acfe7f6da66a25dd2db5c665f3673377ed9a1775f12a46eb94b876934830a7d" exitCode=0 Feb 23 07:01:02 crc kubenswrapper[5047]: I0223 07:01:02.422295 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" event={"ID":"e914782d-7780-4bd6-947e-7d381f18c574","Type":"ContainerDied","Data":"1acfe7f6da66a25dd2db5c665f3673377ed9a1775f12a46eb94b876934830a7d"} Feb 23 07:01:03 crc kubenswrapper[5047]: I0223 07:01:03.433237 5047 generic.go:334] "Generic (PLEG): container finished" podID="e914782d-7780-4bd6-947e-7d381f18c574" containerID="20c10cb802a69ee13accc7eb17eb16d5fe5513b71cd63468317cfe291f716904" exitCode=0 Feb 23 07:01:03 crc kubenswrapper[5047]: I0223 07:01:03.433581 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" event={"ID":"e914782d-7780-4bd6-947e-7d381f18c574","Type":"ContainerDied","Data":"20c10cb802a69ee13accc7eb17eb16d5fe5513b71cd63468317cfe291f716904"} Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.754177 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.838864 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trghr\" (UniqueName: \"kubernetes.io/projected/e914782d-7780-4bd6-947e-7d381f18c574-kube-api-access-trghr\") pod \"e914782d-7780-4bd6-947e-7d381f18c574\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.840791 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-util\") pod \"e914782d-7780-4bd6-947e-7d381f18c574\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.848720 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-bundle\") pod \"e914782d-7780-4bd6-947e-7d381f18c574\" (UID: \"e914782d-7780-4bd6-947e-7d381f18c574\") " Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.850098 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e914782d-7780-4bd6-947e-7d381f18c574-kube-api-access-trghr" (OuterVolumeSpecName: "kube-api-access-trghr") pod "e914782d-7780-4bd6-947e-7d381f18c574" (UID: "e914782d-7780-4bd6-947e-7d381f18c574"). InnerVolumeSpecName "kube-api-access-trghr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.851537 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-bundle" (OuterVolumeSpecName: "bundle") pod "e914782d-7780-4bd6-947e-7d381f18c574" (UID: "e914782d-7780-4bd6-947e-7d381f18c574"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.852118 5047 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.852158 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trghr\" (UniqueName: \"kubernetes.io/projected/e914782d-7780-4bd6-947e-7d381f18c574-kube-api-access-trghr\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.867728 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-util" (OuterVolumeSpecName: "util") pod "e914782d-7780-4bd6-947e-7d381f18c574" (UID: "e914782d-7780-4bd6-947e-7d381f18c574"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:04 crc kubenswrapper[5047]: I0223 07:01:04.953820 5047 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e914782d-7780-4bd6-947e-7d381f18c574-util\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:05 crc kubenswrapper[5047]: I0223 07:01:05.468646 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" event={"ID":"e914782d-7780-4bd6-947e-7d381f18c574","Type":"ContainerDied","Data":"24b5daeaccbcae11a8631804ac47e8927cf2dadc5d0417738575977f2b13a96b"} Feb 23 07:01:05 crc kubenswrapper[5047]: I0223 07:01:05.468708 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b5daeaccbcae11a8631804ac47e8927cf2dadc5d0417738575977f2b13a96b" Feb 23 07:01:05 crc kubenswrapper[5047]: I0223 07:01:05.468739 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.460145 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xlc2c"] Feb 23 07:01:12 crc kubenswrapper[5047]: E0223 07:01:12.461549 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e914782d-7780-4bd6-947e-7d381f18c574" containerName="pull" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.461579 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e914782d-7780-4bd6-947e-7d381f18c574" containerName="pull" Feb 23 07:01:12 crc kubenswrapper[5047]: E0223 07:01:12.461613 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd20eb8d-5a96-407f-a898-1dad49ba8355" containerName="console" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.461631 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd20eb8d-5a96-407f-a898-1dad49ba8355" containerName="console" Feb 23 07:01:12 crc kubenswrapper[5047]: E0223 07:01:12.461670 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e914782d-7780-4bd6-947e-7d381f18c574" containerName="extract" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.461688 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e914782d-7780-4bd6-947e-7d381f18c574" containerName="extract" Feb 23 07:01:12 crc kubenswrapper[5047]: E0223 07:01:12.461720 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e914782d-7780-4bd6-947e-7d381f18c574" containerName="util" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.461736 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e914782d-7780-4bd6-947e-7d381f18c574" containerName="util" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.461982 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd20eb8d-5a96-407f-a898-1dad49ba8355" containerName="console" Feb 23 
07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.462015 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e914782d-7780-4bd6-947e-7d381f18c574" containerName="extract" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.463505 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.481788 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlc2c"] Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.570494 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd69l\" (UniqueName: \"kubernetes.io/projected/870690c0-1708-445c-8782-20b41e5d685b-kube-api-access-dd69l\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.570570 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-catalog-content\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.570652 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-utilities\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.672332 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd69l\" (UniqueName: 
\"kubernetes.io/projected/870690c0-1708-445c-8782-20b41e5d685b-kube-api-access-dd69l\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.672428 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-catalog-content\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.672531 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-utilities\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.673026 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-catalog-content\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.673322 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-utilities\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.700807 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd69l\" (UniqueName: 
\"kubernetes.io/projected/870690c0-1708-445c-8782-20b41e5d685b-kube-api-access-dd69l\") pod \"redhat-marketplace-xlc2c\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:12 crc kubenswrapper[5047]: I0223 07:01:12.780979 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:13 crc kubenswrapper[5047]: I0223 07:01:13.141747 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlc2c"] Feb 23 07:01:13 crc kubenswrapper[5047]: I0223 07:01:13.533396 5047 generic.go:334] "Generic (PLEG): container finished" podID="870690c0-1708-445c-8782-20b41e5d685b" containerID="6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27" exitCode=0 Feb 23 07:01:13 crc kubenswrapper[5047]: I0223 07:01:13.533475 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlc2c" event={"ID":"870690c0-1708-445c-8782-20b41e5d685b","Type":"ContainerDied","Data":"6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27"} Feb 23 07:01:13 crc kubenswrapper[5047]: I0223 07:01:13.533512 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlc2c" event={"ID":"870690c0-1708-445c-8782-20b41e5d685b","Type":"ContainerStarted","Data":"058f0c97661020ab6a64a92bf9c3e0040ed40a26a0e737c810cd1a16294e955c"} Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.426255 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc"] Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.427587 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.430121 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-8vxwj" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.432339 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.432766 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.433035 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.440584 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.462066 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc"] Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.504228 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-webhook-cert\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.504373 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2pq\" (UniqueName: \"kubernetes.io/projected/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-kube-api-access-5j2pq\") pod 
\"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.504421 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-apiservice-cert\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.542979 5047 generic.go:334] "Generic (PLEG): container finished" podID="870690c0-1708-445c-8782-20b41e5d685b" containerID="eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71" exitCode=0 Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.543043 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlc2c" event={"ID":"870690c0-1708-445c-8782-20b41e5d685b","Type":"ContainerDied","Data":"eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71"} Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.605373 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2pq\" (UniqueName: \"kubernetes.io/projected/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-kube-api-access-5j2pq\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.605435 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-apiservice-cert\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: 
\"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.605840 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-webhook-cert\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.613218 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-apiservice-cert\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.613452 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-webhook-cert\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.623750 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2pq\" (UniqueName: \"kubernetes.io/projected/a2e5f18f-e775-4839-8e1b-a5b05c2af4e9-kube-api-access-5j2pq\") pod \"metallb-operator-controller-manager-65b877fd6b-65kbc\" (UID: \"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9\") " pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.671205 5047 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j"] Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.672062 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.677261 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.678999 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-45fhg" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.684987 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.697194 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j"] Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.743789 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.809284 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9b84120-1d90-41d1-8686-535d6dcfb6d4-webhook-cert\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.809364 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn6jb\" (UniqueName: \"kubernetes.io/projected/d9b84120-1d90-41d1-8686-535d6dcfb6d4-kube-api-access-wn6jb\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.809418 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9b84120-1d90-41d1-8686-535d6dcfb6d4-apiservice-cert\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.911216 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9b84120-1d90-41d1-8686-535d6dcfb6d4-webhook-cert\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.911759 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wn6jb\" (UniqueName: \"kubernetes.io/projected/d9b84120-1d90-41d1-8686-535d6dcfb6d4-kube-api-access-wn6jb\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.911815 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9b84120-1d90-41d1-8686-535d6dcfb6d4-apiservice-cert\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.929806 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9b84120-1d90-41d1-8686-535d6dcfb6d4-apiservice-cert\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.936590 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9b84120-1d90-41d1-8686-535d6dcfb6d4-webhook-cert\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") " pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.936807 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn6jb\" (UniqueName: \"kubernetes.io/projected/d9b84120-1d90-41d1-8686-535d6dcfb6d4-kube-api-access-wn6jb\") pod \"metallb-operator-webhook-server-74bbf6f644-rwt8j\" (UID: \"d9b84120-1d90-41d1-8686-535d6dcfb6d4\") 
" pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:14 crc kubenswrapper[5047]: I0223 07:01:14.986427 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:15 crc kubenswrapper[5047]: I0223 07:01:15.034538 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc"] Feb 23 07:01:15 crc kubenswrapper[5047]: I0223 07:01:15.552723 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlc2c" event={"ID":"870690c0-1708-445c-8782-20b41e5d685b","Type":"ContainerStarted","Data":"e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02"} Feb 23 07:01:15 crc kubenswrapper[5047]: I0223 07:01:15.553777 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" event={"ID":"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9","Type":"ContainerStarted","Data":"2968688fd06f1fff973e48af8e17262f93f0cbe3c274c3c642b6c6aa79b0653e"} Feb 23 07:01:15 crc kubenswrapper[5047]: I0223 07:01:15.555039 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j"] Feb 23 07:01:15 crc kubenswrapper[5047]: I0223 07:01:15.572578 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xlc2c" podStartSLOduration=2.151224615 podStartE2EDuration="3.572551459s" podCreationTimestamp="2026-02-23 07:01:12 +0000 UTC" firstStartedPulling="2026-02-23 07:01:13.534949118 +0000 UTC m=+995.786276252" lastFinishedPulling="2026-02-23 07:01:14.956275962 +0000 UTC m=+997.207603096" observedRunningTime="2026-02-23 07:01:15.569186407 +0000 UTC m=+997.820513551" watchObservedRunningTime="2026-02-23 07:01:15.572551459 +0000 UTC m=+997.823878593" Feb 23 07:01:16 crc 
kubenswrapper[5047]: I0223 07:01:16.562109 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" event={"ID":"d9b84120-1d90-41d1-8686-535d6dcfb6d4","Type":"ContainerStarted","Data":"e61285e4fe2a6c99141f42b0c9a1cca2e919e30b885986e2e271422ffb669573"} Feb 23 07:01:18 crc kubenswrapper[5047]: I0223 07:01:18.580707 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" event={"ID":"a2e5f18f-e775-4839-8e1b-a5b05c2af4e9","Type":"ContainerStarted","Data":"f61dc69f47678ebcb5a55a5f83dd7ebf4bebed55613426a59168b8ec3a304598"} Feb 23 07:01:18 crc kubenswrapper[5047]: I0223 07:01:18.581571 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:18 crc kubenswrapper[5047]: I0223 07:01:18.609099 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" podStartSLOduration=1.861948497 podStartE2EDuration="4.60907958s" podCreationTimestamp="2026-02-23 07:01:14 +0000 UTC" firstStartedPulling="2026-02-23 07:01:15.116604621 +0000 UTC m=+997.367931755" lastFinishedPulling="2026-02-23 07:01:17.863735704 +0000 UTC m=+1000.115062838" observedRunningTime="2026-02-23 07:01:18.606399786 +0000 UTC m=+1000.857726920" watchObservedRunningTime="2026-02-23 07:01:18.60907958 +0000 UTC m=+1000.860406714" Feb 23 07:01:21 crc kubenswrapper[5047]: I0223 07:01:21.612268 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" event={"ID":"d9b84120-1d90-41d1-8686-535d6dcfb6d4","Type":"ContainerStarted","Data":"eb3444338f212956a4a9666645568fe2d6a3da3d2640cb2576b185a22f83beb8"} Feb 23 07:01:21 crc kubenswrapper[5047]: I0223 07:01:21.613162 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:21 crc kubenswrapper[5047]: I0223 07:01:21.634539 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" podStartSLOduration=2.659247993 podStartE2EDuration="7.634514896s" podCreationTimestamp="2026-02-23 07:01:14 +0000 UTC" firstStartedPulling="2026-02-23 07:01:15.561991679 +0000 UTC m=+997.813318813" lastFinishedPulling="2026-02-23 07:01:20.537258572 +0000 UTC m=+1002.788585716" observedRunningTime="2026-02-23 07:01:21.633691433 +0000 UTC m=+1003.885018587" watchObservedRunningTime="2026-02-23 07:01:21.634514896 +0000 UTC m=+1003.885842040" Feb 23 07:01:22 crc kubenswrapper[5047]: I0223 07:01:22.781328 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:22 crc kubenswrapper[5047]: I0223 07:01:22.781519 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:22 crc kubenswrapper[5047]: I0223 07:01:22.824452 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:23 crc kubenswrapper[5047]: I0223 07:01:23.662427 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:23 crc kubenswrapper[5047]: I0223 07:01:23.709604 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlc2c"] Feb 23 07:01:25 crc kubenswrapper[5047]: I0223 07:01:25.639126 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xlc2c" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="registry-server" 
containerID="cri-o://e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02" gracePeriod=2 Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.080311 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.220175 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-utilities\") pod \"870690c0-1708-445c-8782-20b41e5d685b\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.220274 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-catalog-content\") pod \"870690c0-1708-445c-8782-20b41e5d685b\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.220389 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd69l\" (UniqueName: \"kubernetes.io/projected/870690c0-1708-445c-8782-20b41e5d685b-kube-api-access-dd69l\") pod \"870690c0-1708-445c-8782-20b41e5d685b\" (UID: \"870690c0-1708-445c-8782-20b41e5d685b\") " Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.221332 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-utilities" (OuterVolumeSpecName: "utilities") pod "870690c0-1708-445c-8782-20b41e5d685b" (UID: "870690c0-1708-445c-8782-20b41e5d685b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.233239 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870690c0-1708-445c-8782-20b41e5d685b-kube-api-access-dd69l" (OuterVolumeSpecName: "kube-api-access-dd69l") pod "870690c0-1708-445c-8782-20b41e5d685b" (UID: "870690c0-1708-445c-8782-20b41e5d685b"). InnerVolumeSpecName "kube-api-access-dd69l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.249412 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "870690c0-1708-445c-8782-20b41e5d685b" (UID: "870690c0-1708-445c-8782-20b41e5d685b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.321669 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.321712 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd69l\" (UniqueName: \"kubernetes.io/projected/870690c0-1708-445c-8782-20b41e5d685b-kube-api-access-dd69l\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.321723 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870690c0-1708-445c-8782-20b41e5d685b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.646869 5047 generic.go:334] "Generic (PLEG): container finished" podID="870690c0-1708-445c-8782-20b41e5d685b" 
containerID="e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02" exitCode=0 Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.646943 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlc2c" event={"ID":"870690c0-1708-445c-8782-20b41e5d685b","Type":"ContainerDied","Data":"e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02"} Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.646976 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xlc2c" event={"ID":"870690c0-1708-445c-8782-20b41e5d685b","Type":"ContainerDied","Data":"058f0c97661020ab6a64a92bf9c3e0040ed40a26a0e737c810cd1a16294e955c"} Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.646995 5047 scope.go:117] "RemoveContainer" containerID="e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.647115 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xlc2c" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.665765 5047 scope.go:117] "RemoveContainer" containerID="eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.669019 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlc2c"] Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.671425 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xlc2c"] Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.681773 5047 scope.go:117] "RemoveContainer" containerID="6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.701199 5047 scope.go:117] "RemoveContainer" containerID="e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02" Feb 23 07:01:26 crc kubenswrapper[5047]: E0223 07:01:26.701735 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02\": container with ID starting with e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02 not found: ID does not exist" containerID="e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.701773 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02"} err="failed to get container status \"e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02\": rpc error: code = NotFound desc = could not find container \"e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02\": container with ID starting with e8a29e9e525ea132527a82f5b1f8763f7a902a9a4f1d8ac5515ef8fab677fe02 not found: 
ID does not exist" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.701808 5047 scope.go:117] "RemoveContainer" containerID="eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71" Feb 23 07:01:26 crc kubenswrapper[5047]: E0223 07:01:26.702256 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71\": container with ID starting with eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71 not found: ID does not exist" containerID="eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.702278 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71"} err="failed to get container status \"eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71\": rpc error: code = NotFound desc = could not find container \"eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71\": container with ID starting with eceef24bd849131ced1fe91b735bbff2a05bf7af34877531e9148e9f11481d71 not found: ID does not exist" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.702290 5047 scope.go:117] "RemoveContainer" containerID="6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27" Feb 23 07:01:26 crc kubenswrapper[5047]: E0223 07:01:26.702604 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27\": container with ID starting with 6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27 not found: ID does not exist" containerID="6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27" Feb 23 07:01:26 crc kubenswrapper[5047]: I0223 07:01:26.702643 5047 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27"} err="failed to get container status \"6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27\": rpc error: code = NotFound desc = could not find container \"6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27\": container with ID starting with 6c3fcef02e6f33ee3ac4572fa3b7e6f3873f54d155e84e1f0b4f1418e248bc27 not found: ID does not exist" Feb 23 07:01:28 crc kubenswrapper[5047]: I0223 07:01:28.349702 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870690c0-1708-445c-8782-20b41e5d685b" path="/var/lib/kubelet/pods/870690c0-1708-445c-8782-20b41e5d685b/volumes" Feb 23 07:01:34 crc kubenswrapper[5047]: I0223 07:01:34.991771 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74bbf6f644-rwt8j" Feb 23 07:01:54 crc kubenswrapper[5047]: I0223 07:01:54.749830 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-65b877fd6b-65kbc" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.488797 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb"] Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.489267 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="extract-utilities" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.489300 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="extract-utilities" Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.489328 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="registry-server" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 
07:01:55.489345 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="registry-server" Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.489384 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="extract-content" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.489398 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="extract-content" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.489618 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="870690c0-1708-445c-8782-20b41e5d685b" containerName="registry-server" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.490400 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.497353 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.497362 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2xbrc" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.500881 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-nh5jn"] Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.513599 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.515746 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.516614 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb"] Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.517967 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537661 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-metrics-certs\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537721 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp26q\" (UniqueName: \"kubernetes.io/projected/59ed94c1-a5ba-4e59-b112-68a944434be0-kube-api-access-rp26q\") pod \"frr-k8s-webhook-server-78b44bf5bb-bhglb\" (UID: \"59ed94c1-a5ba-4e59-b112-68a944434be0\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537752 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-reloader\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537788 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-metrics\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537811 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59ed94c1-a5ba-4e59-b112-68a944434be0-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bhglb\" (UID: \"59ed94c1-a5ba-4e59-b112-68a944434be0\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537853 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttfx\" (UniqueName: \"kubernetes.io/projected/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-kube-api-access-bttfx\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537877 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-startup\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537931 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-conf\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.537960 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-sockets\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.590009 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wxppp"] Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.591396 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.595681 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5x969" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.595936 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.596057 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.596530 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.611350 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-9sdh2"] Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.612368 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.615051 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.632891 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-9sdh2"] Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638756 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-conf\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638811 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jkx6\" (UniqueName: \"kubernetes.io/projected/d6b13e52-f7be-49a3-accf-f5084c8d38a6-kube-api-access-4jkx6\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638839 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-sockets\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638873 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-metrics-certs\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638896 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rp26q\" (UniqueName: \"kubernetes.io/projected/59ed94c1-a5ba-4e59-b112-68a944434be0-kube-api-access-rp26q\") pod \"frr-k8s-webhook-server-78b44bf5bb-bhglb\" (UID: \"59ed94c1-a5ba-4e59-b112-68a944434be0\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638932 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metrics-certs\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638949 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-reloader\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638970 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e8d69d4-4bff-47e9-950f-c07c56adad77-cert\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.638987 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6vzv\" (UniqueName: \"kubernetes.io/projected/2e8d69d4-4bff-47e9-950f-c07c56adad77-kube-api-access-z6vzv\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639065 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-metrics\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639083 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e8d69d4-4bff-47e9-950f-c07c56adad77-metrics-certs\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639098 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59ed94c1-a5ba-4e59-b112-68a944434be0-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bhglb\" (UID: \"59ed94c1-a5ba-4e59-b112-68a944434be0\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639123 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639147 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bttfx\" (UniqueName: \"kubernetes.io/projected/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-kube-api-access-bttfx\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639163 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-startup\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639182 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metallb-excludel2\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639637 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-conf\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.639826 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-sockets\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.640832 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-metrics\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.641303 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-reloader\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: 
E0223 07:01:55.641390 5047 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.641441 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59ed94c1-a5ba-4e59-b112-68a944434be0-cert podName:59ed94c1-a5ba-4e59-b112-68a944434be0 nodeName:}" failed. No retries permitted until 2026-02-23 07:01:56.141422839 +0000 UTC m=+1038.392749973 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59ed94c1-a5ba-4e59-b112-68a944434be0-cert") pod "frr-k8s-webhook-server-78b44bf5bb-bhglb" (UID: "59ed94c1-a5ba-4e59-b112-68a944434be0") : secret "frr-k8s-webhook-server-cert" not found Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.642550 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-frr-startup\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.649203 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-metrics-certs\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.664682 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttfx\" (UniqueName: \"kubernetes.io/projected/e7785968-bae6-4ad4-bc4a-ccc4fac2cf41-kube-api-access-bttfx\") pod \"frr-k8s-nh5jn\" (UID: \"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41\") " pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.666397 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rp26q\" (UniqueName: \"kubernetes.io/projected/59ed94c1-a5ba-4e59-b112-68a944434be0-kube-api-access-rp26q\") pod \"frr-k8s-webhook-server-78b44bf5bb-bhglb\" (UID: \"59ed94c1-a5ba-4e59-b112-68a944434be0\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.740117 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metrics-certs\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.740173 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e8d69d4-4bff-47e9-950f-c07c56adad77-cert\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.740195 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6vzv\" (UniqueName: \"kubernetes.io/projected/2e8d69d4-4bff-47e9-950f-c07c56adad77-kube-api-access-z6vzv\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.740218 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e8d69d4-4bff-47e9-950f-c07c56adad77-metrics-certs\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.740258 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.740282 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metallb-excludel2\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.740308 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jkx6\" (UniqueName: \"kubernetes.io/projected/d6b13e52-f7be-49a3-accf-f5084c8d38a6-kube-api-access-4jkx6\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.740336 5047 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.740444 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metrics-certs podName:d6b13e52-f7be-49a3-accf-f5084c8d38a6 nodeName:}" failed. No retries permitted until 2026-02-23 07:01:56.240416472 +0000 UTC m=+1038.491743606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metrics-certs") pod "speaker-wxppp" (UID: "d6b13e52-f7be-49a3-accf-f5084c8d38a6") : secret "speaker-certs-secret" not found Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.740746 5047 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 23 07:01:55 crc kubenswrapper[5047]: E0223 07:01:55.740802 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist podName:d6b13e52-f7be-49a3-accf-f5084c8d38a6 nodeName:}" failed. No retries permitted until 2026-02-23 07:01:56.240784882 +0000 UTC m=+1038.492112016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist") pod "speaker-wxppp" (UID: "d6b13e52-f7be-49a3-accf-f5084c8d38a6") : secret "metallb-memberlist" not found Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.741457 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metallb-excludel2\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.744406 5047 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.744449 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e8d69d4-4bff-47e9-950f-c07c56adad77-metrics-certs\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc 
kubenswrapper[5047]: I0223 07:01:55.757302 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e8d69d4-4bff-47e9-950f-c07c56adad77-cert\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.764628 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6vzv\" (UniqueName: \"kubernetes.io/projected/2e8d69d4-4bff-47e9-950f-c07c56adad77-kube-api-access-z6vzv\") pod \"controller-69bbfbf88f-9sdh2\" (UID: \"2e8d69d4-4bff-47e9-950f-c07c56adad77\") " pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.770565 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jkx6\" (UniqueName: \"kubernetes.io/projected/d6b13e52-f7be-49a3-accf-f5084c8d38a6-kube-api-access-4jkx6\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.833978 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:01:55 crc kubenswrapper[5047]: I0223 07:01:55.927211 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.146375 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59ed94c1-a5ba-4e59-b112-68a944434be0-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bhglb\" (UID: \"59ed94c1-a5ba-4e59-b112-68a944434be0\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.157285 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59ed94c1-a5ba-4e59-b112-68a944434be0-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bhglb\" (UID: \"59ed94c1-a5ba-4e59-b112-68a944434be0\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.173016 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-9sdh2"] Feb 23 07:01:56 crc kubenswrapper[5047]: W0223 07:01:56.182566 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e8d69d4_4bff_47e9_950f_c07c56adad77.slice/crio-7923687082ada9959b4368ea9f156a2073a2b9cce7073e8ad1e2d125ee046699 WatchSource:0}: Error finding container 7923687082ada9959b4368ea9f156a2073a2b9cce7073e8ad1e2d125ee046699: Status 404 returned error can't find the container with id 7923687082ada9959b4368ea9f156a2073a2b9cce7073e8ad1e2d125ee046699 Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.247975 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metrics-certs\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.248067 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:56 crc kubenswrapper[5047]: E0223 07:01:56.248277 5047 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 23 07:01:56 crc kubenswrapper[5047]: E0223 07:01:56.248366 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist podName:d6b13e52-f7be-49a3-accf-f5084c8d38a6 nodeName:}" failed. No retries permitted until 2026-02-23 07:01:57.248342739 +0000 UTC m=+1039.499669883 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist") pod "speaker-wxppp" (UID: "d6b13e52-f7be-49a3-accf-f5084c8d38a6") : secret "metallb-memberlist" not found Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.252250 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-metrics-certs\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.406920 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.820064 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb"] Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.854005 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-9sdh2" event={"ID":"2e8d69d4-4bff-47e9-950f-c07c56adad77","Type":"ContainerStarted","Data":"feb4a34d638692568b26e95ffebb7253e04868ce655d7e0361f3b6ac15cd3b38"} Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.854059 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-9sdh2" event={"ID":"2e8d69d4-4bff-47e9-950f-c07c56adad77","Type":"ContainerStarted","Data":"d204eba5ff286136bdb18c410479d7829bb8b516ebbb4dcaed50c450c330843a"} Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.854071 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-9sdh2" event={"ID":"2e8d69d4-4bff-47e9-950f-c07c56adad77","Type":"ContainerStarted","Data":"7923687082ada9959b4368ea9f156a2073a2b9cce7073e8ad1e2d125ee046699"} Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.855121 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.857845 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" event={"ID":"59ed94c1-a5ba-4e59-b112-68a944434be0","Type":"ContainerStarted","Data":"79545f2af10e8dbb948b30a11dab065f0c23488034ad5ac54a8f0520ca52d69d"} Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.859973 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" 
event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"a6d73680816c242c88a3d98f7ff1121c1816a53705964510ee8420f44251e064"} Feb 23 07:01:56 crc kubenswrapper[5047]: I0223 07:01:56.872676 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-9sdh2" podStartSLOduration=1.872648267 podStartE2EDuration="1.872648267s" podCreationTimestamp="2026-02-23 07:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:56.870827457 +0000 UTC m=+1039.122154591" watchObservedRunningTime="2026-02-23 07:01:56.872648267 +0000 UTC m=+1039.123975401" Feb 23 07:01:57 crc kubenswrapper[5047]: I0223 07:01:57.271648 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:57 crc kubenswrapper[5047]: I0223 07:01:57.280022 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6b13e52-f7be-49a3-accf-f5084c8d38a6-memberlist\") pod \"speaker-wxppp\" (UID: \"d6b13e52-f7be-49a3-accf-f5084c8d38a6\") " pod="metallb-system/speaker-wxppp" Feb 23 07:01:57 crc kubenswrapper[5047]: I0223 07:01:57.411239 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wxppp" Feb 23 07:01:57 crc kubenswrapper[5047]: W0223 07:01:57.433941 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b13e52_f7be_49a3_accf_f5084c8d38a6.slice/crio-604912238a67a32c0d95366c7cf773a9bf0183fc84ae114b1677ed6c2b5c273f WatchSource:0}: Error finding container 604912238a67a32c0d95366c7cf773a9bf0183fc84ae114b1677ed6c2b5c273f: Status 404 returned error can't find the container with id 604912238a67a32c0d95366c7cf773a9bf0183fc84ae114b1677ed6c2b5c273f Feb 23 07:01:57 crc kubenswrapper[5047]: I0223 07:01:57.869542 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wxppp" event={"ID":"d6b13e52-f7be-49a3-accf-f5084c8d38a6","Type":"ContainerStarted","Data":"fc4368a4f6bafe0149835ef38c7e84814f3d387d20d697d8dc3d0123119c30a2"} Feb 23 07:01:57 crc kubenswrapper[5047]: I0223 07:01:57.869648 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wxppp" event={"ID":"d6b13e52-f7be-49a3-accf-f5084c8d38a6","Type":"ContainerStarted","Data":"604912238a67a32c0d95366c7cf773a9bf0183fc84ae114b1677ed6c2b5c273f"} Feb 23 07:01:58 crc kubenswrapper[5047]: I0223 07:01:58.879172 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wxppp" event={"ID":"d6b13e52-f7be-49a3-accf-f5084c8d38a6","Type":"ContainerStarted","Data":"d6a737b04f6dcbe8f00c6128aaa35fe14aa8bdfd2f15acecf5b9b3ce2832f4d6"} Feb 23 07:01:58 crc kubenswrapper[5047]: I0223 07:01:58.879737 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wxppp" Feb 23 07:01:58 crc kubenswrapper[5047]: I0223 07:01:58.901958 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wxppp" podStartSLOduration=3.90193734 podStartE2EDuration="3.90193734s" podCreationTimestamp="2026-02-23 07:01:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:58.901034485 +0000 UTC m=+1041.152361619" watchObservedRunningTime="2026-02-23 07:01:58.90193734 +0000 UTC m=+1041.153264474" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.062480 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fv88x"] Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.070271 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.196849 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fv88x"] Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.213506 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-catalog-content\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.213577 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httzt\" (UniqueName: \"kubernetes.io/projected/53e16170-88d2-48dc-9d81-d9aa9c768655-kube-api-access-httzt\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.213640 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-utilities\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " 
pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.315836 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-utilities\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.316363 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-catalog-content\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.316399 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-httzt\" (UniqueName: \"kubernetes.io/projected/53e16170-88d2-48dc-9d81-d9aa9c768655-kube-api-access-httzt\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.316444 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-utilities\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.316771 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-catalog-content\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " 
pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.351591 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-httzt\" (UniqueName: \"kubernetes.io/projected/53e16170-88d2-48dc-9d81-d9aa9c768655-kube-api-access-httzt\") pod \"community-operators-fv88x\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.389189 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.672090 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fv88x"] Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.897645 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fv88x" event={"ID":"53e16170-88d2-48dc-9d81-d9aa9c768655","Type":"ContainerStarted","Data":"21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e"} Feb 23 07:02:00 crc kubenswrapper[5047]: I0223 07:02:00.898157 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fv88x" event={"ID":"53e16170-88d2-48dc-9d81-d9aa9c768655","Type":"ContainerStarted","Data":"71e5251f29ffb4f57816d612c2e52969c0da7488303d9b5419c721402ce94135"} Feb 23 07:02:01 crc kubenswrapper[5047]: I0223 07:02:01.911495 5047 generic.go:334] "Generic (PLEG): container finished" podID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerID="21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e" exitCode=0 Feb 23 07:02:01 crc kubenswrapper[5047]: I0223 07:02:01.911630 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fv88x" 
event={"ID":"53e16170-88d2-48dc-9d81-d9aa9c768655","Type":"ContainerDied","Data":"21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e"} Feb 23 07:02:04 crc kubenswrapper[5047]: I0223 07:02:04.934017 5047 generic.go:334] "Generic (PLEG): container finished" podID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerID="2637822f3a9fcad28ae78e3e698f90b8365d4287c573e22a13bc2bf0cf435c77" exitCode=0 Feb 23 07:02:04 crc kubenswrapper[5047]: I0223 07:02:04.934151 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerDied","Data":"2637822f3a9fcad28ae78e3e698f90b8365d4287c573e22a13bc2bf0cf435c77"} Feb 23 07:02:04 crc kubenswrapper[5047]: I0223 07:02:04.936921 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" event={"ID":"59ed94c1-a5ba-4e59-b112-68a944434be0","Type":"ContainerStarted","Data":"617e8d7edbf6d29752b9f1b8b07d8be37d90c0d0e0456c3eec980c691e9c8222"} Feb 23 07:02:04 crc kubenswrapper[5047]: I0223 07:02:04.937621 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:02:04 crc kubenswrapper[5047]: I0223 07:02:04.940717 5047 generic.go:334] "Generic (PLEG): container finished" podID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerID="14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560" exitCode=0 Feb 23 07:02:04 crc kubenswrapper[5047]: I0223 07:02:04.940771 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fv88x" event={"ID":"53e16170-88d2-48dc-9d81-d9aa9c768655","Type":"ContainerDied","Data":"14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560"} Feb 23 07:02:05 crc kubenswrapper[5047]: I0223 07:02:05.005975 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" podStartSLOduration=3.043873064 podStartE2EDuration="10.005951042s" podCreationTimestamp="2026-02-23 07:01:55 +0000 UTC" firstStartedPulling="2026-02-23 07:01:56.842692833 +0000 UTC m=+1039.094019967" lastFinishedPulling="2026-02-23 07:02:03.804770811 +0000 UTC m=+1046.056097945" observedRunningTime="2026-02-23 07:02:04.999233448 +0000 UTC m=+1047.250560602" watchObservedRunningTime="2026-02-23 07:02:05.005951042 +0000 UTC m=+1047.257278186" Feb 23 07:02:05 crc kubenswrapper[5047]: I0223 07:02:05.952370 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fv88x" event={"ID":"53e16170-88d2-48dc-9d81-d9aa9c768655","Type":"ContainerStarted","Data":"58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0"} Feb 23 07:02:05 crc kubenswrapper[5047]: I0223 07:02:05.954351 5047 generic.go:334] "Generic (PLEG): container finished" podID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerID="91451b24aa76fee0c2fd6ccc4ab6c2aecb8b205b9a34fc94e3e88dfc9f4c6a61" exitCode=0 Feb 23 07:02:05 crc kubenswrapper[5047]: I0223 07:02:05.954670 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerDied","Data":"91451b24aa76fee0c2fd6ccc4ab6c2aecb8b205b9a34fc94e3e88dfc9f4c6a61"} Feb 23 07:02:05 crc kubenswrapper[5047]: I0223 07:02:05.985866 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fv88x" podStartSLOduration=4.322349453 podStartE2EDuration="5.985842078s" podCreationTimestamp="2026-02-23 07:02:00 +0000 UTC" firstStartedPulling="2026-02-23 07:02:03.68072746 +0000 UTC m=+1045.932054594" lastFinishedPulling="2026-02-23 07:02:05.344220075 +0000 UTC m=+1047.595547219" observedRunningTime="2026-02-23 07:02:05.982237259 +0000 UTC m=+1048.233564453" watchObservedRunningTime="2026-02-23 07:02:05.985842078 +0000 
UTC m=+1048.237169212" Feb 23 07:02:06 crc kubenswrapper[5047]: I0223 07:02:06.965067 5047 generic.go:334] "Generic (PLEG): container finished" podID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerID="c848edba99b637bc34452fdc49358b57372a780a67d2d777e69f6d6308cc5e22" exitCode=0 Feb 23 07:02:06 crc kubenswrapper[5047]: I0223 07:02:06.965186 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerDied","Data":"c848edba99b637bc34452fdc49358b57372a780a67d2d777e69f6d6308cc5e22"} Feb 23 07:02:07 crc kubenswrapper[5047]: I0223 07:02:07.417189 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wxppp" Feb 23 07:02:07 crc kubenswrapper[5047]: I0223 07:02:07.976015 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"c318b5f9b72fcfc409b443c2c8f40b262ffa9eeb12357b83db5d42d829068caf"} Feb 23 07:02:07 crc kubenswrapper[5047]: I0223 07:02:07.976068 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"e8048beb13a8d98699c85ae9520d9c2f46bcbfbfeddb265c784f34708f1ff284"} Feb 23 07:02:07 crc kubenswrapper[5047]: I0223 07:02:07.976078 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"b959c45af42e4b052a8b27e032a1f905cc4895dee3fed4a6d1f0ebeb6d9b4548"} Feb 23 07:02:07 crc kubenswrapper[5047]: I0223 07:02:07.976091 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"f0983d003fabd10061ec547251645b174c56ea5d575bcb957ce5c943deee01eb"} Feb 23 07:02:07 crc 
kubenswrapper[5047]: I0223 07:02:07.976101 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"14c469ac1ef1a58c0c7f2648c476711dcc42a0beb11a51e9dfb1401c861ae4e1"} Feb 23 07:02:08 crc kubenswrapper[5047]: I0223 07:02:08.907582 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2"] Feb 23 07:02:08 crc kubenswrapper[5047]: I0223 07:02:08.909918 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:08 crc kubenswrapper[5047]: I0223 07:02:08.915552 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 07:02:08 crc kubenswrapper[5047]: I0223 07:02:08.949972 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2"] Feb 23 07:02:08 crc kubenswrapper[5047]: I0223 07:02:08.990536 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"62ebd312ef9847ba998f636213ecfcd41e0567f8055e1232aaa82fcb88df1d3a"} Feb 23 07:02:08 crc kubenswrapper[5047]: I0223 07:02:08.990748 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.021328 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-nh5jn" podStartSLOduration=6.216332423 podStartE2EDuration="14.021295649s" podCreationTimestamp="2026-02-23 07:01:55 +0000 UTC" firstStartedPulling="2026-02-23 07:01:55.976642688 +0000 UTC m=+1038.227969822" lastFinishedPulling="2026-02-23 
07:02:03.781605914 +0000 UTC m=+1046.032933048" observedRunningTime="2026-02-23 07:02:09.01621323 +0000 UTC m=+1051.267540374" watchObservedRunningTime="2026-02-23 07:02:09.021295649 +0000 UTC m=+1051.272622803" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.053130 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.053260 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chspj\" (UniqueName: \"kubernetes.io/projected/eb5db10f-166f-4c03-b8f9-a8d549c48948-kube-api-access-chspj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.053299 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.154714 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chspj\" (UniqueName: \"kubernetes.io/projected/eb5db10f-166f-4c03-b8f9-a8d549c48948-kube-api-access-chspj\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.154799 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.155002 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.155860 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.155891 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.188113 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chspj\" (UniqueName: \"kubernetes.io/projected/eb5db10f-166f-4c03-b8f9-a8d549c48948-kube-api-access-chspj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.228106 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:09 crc kubenswrapper[5047]: I0223 07:02:09.705307 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2"] Feb 23 07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.001995 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerID="3a1bc8c950eee6ebc40f04a665c4990ebd661f41164c7b7afd346690cdee8bfa" exitCode=0 Feb 23 07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.002087 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" event={"ID":"eb5db10f-166f-4c03-b8f9-a8d549c48948","Type":"ContainerDied","Data":"3a1bc8c950eee6ebc40f04a665c4990ebd661f41164c7b7afd346690cdee8bfa"} Feb 23 07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.002276 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" event={"ID":"eb5db10f-166f-4c03-b8f9-a8d549c48948","Type":"ContainerStarted","Data":"aa1bb675989e81ed6091f4633f416d62678adf25c3af309ab46ce0c96f39bc16"} Feb 23 
07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.390290 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.390734 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.440337 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.836086 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:02:10 crc kubenswrapper[5047]: I0223 07:02:10.880203 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-nh5jn" Feb 23 07:02:11 crc kubenswrapper[5047]: I0223 07:02:11.067179 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.036629 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fv88x"] Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.037391 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fv88x" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="registry-server" containerID="cri-o://58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0" gracePeriod=2 Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.487259 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.630311 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-catalog-content\") pod \"53e16170-88d2-48dc-9d81-d9aa9c768655\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.631090 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-httzt\" (UniqueName: \"kubernetes.io/projected/53e16170-88d2-48dc-9d81-d9aa9c768655-kube-api-access-httzt\") pod \"53e16170-88d2-48dc-9d81-d9aa9c768655\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.631178 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-utilities\") pod \"53e16170-88d2-48dc-9d81-d9aa9c768655\" (UID: \"53e16170-88d2-48dc-9d81-d9aa9c768655\") " Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.631956 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-utilities" (OuterVolumeSpecName: "utilities") pod "53e16170-88d2-48dc-9d81-d9aa9c768655" (UID: "53e16170-88d2-48dc-9d81-d9aa9c768655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.642242 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e16170-88d2-48dc-9d81-d9aa9c768655-kube-api-access-httzt" (OuterVolumeSpecName: "kube-api-access-httzt") pod "53e16170-88d2-48dc-9d81-d9aa9c768655" (UID: "53e16170-88d2-48dc-9d81-d9aa9c768655"). InnerVolumeSpecName "kube-api-access-httzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.696661 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53e16170-88d2-48dc-9d81-d9aa9c768655" (UID: "53e16170-88d2-48dc-9d81-d9aa9c768655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.733262 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.733311 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-httzt\" (UniqueName: \"kubernetes.io/projected/53e16170-88d2-48dc-9d81-d9aa9c768655-kube-api-access-httzt\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:13 crc kubenswrapper[5047]: I0223 07:02:13.733323 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53e16170-88d2-48dc-9d81-d9aa9c768655-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.046525 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerID="49169672b3cd2d5e34dae76ce73aa5a8132980525d7499cd1534e5b4471ce252" exitCode=0 Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.046626 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" event={"ID":"eb5db10f-166f-4c03-b8f9-a8d549c48948","Type":"ContainerDied","Data":"49169672b3cd2d5e34dae76ce73aa5a8132980525d7499cd1534e5b4471ce252"} Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.049937 5047 
generic.go:334] "Generic (PLEG): container finished" podID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerID="58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0" exitCode=0 Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.049998 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fv88x" event={"ID":"53e16170-88d2-48dc-9d81-d9aa9c768655","Type":"ContainerDied","Data":"58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0"} Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.050042 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fv88x" event={"ID":"53e16170-88d2-48dc-9d81-d9aa9c768655","Type":"ContainerDied","Data":"71e5251f29ffb4f57816d612c2e52969c0da7488303d9b5419c721402ce94135"} Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.050065 5047 scope.go:117] "RemoveContainer" containerID="58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.050144 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fv88x" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.102341 5047 scope.go:117] "RemoveContainer" containerID="14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.103829 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fv88x"] Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.112230 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fv88x"] Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.148453 5047 scope.go:117] "RemoveContainer" containerID="21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.198742 5047 scope.go:117] "RemoveContainer" containerID="58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0" Feb 23 07:02:14 crc kubenswrapper[5047]: E0223 07:02:14.200295 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0\": container with ID starting with 58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0 not found: ID does not exist" containerID="58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.200397 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0"} err="failed to get container status \"58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0\": rpc error: code = NotFound desc = could not find container \"58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0\": container with ID starting with 58d9de180920e579db48ee6897169db34614be3cf3e3717b9df280b4453d4ec0 not 
found: ID does not exist" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.200458 5047 scope.go:117] "RemoveContainer" containerID="14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560" Feb 23 07:02:14 crc kubenswrapper[5047]: E0223 07:02:14.201540 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560\": container with ID starting with 14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560 not found: ID does not exist" containerID="14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.201587 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560"} err="failed to get container status \"14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560\": rpc error: code = NotFound desc = could not find container \"14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560\": container with ID starting with 14b7492bdd46a93900fcf2a3d61609134f131f5f28d883863789ee35abaca560 not found: ID does not exist" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.201741 5047 scope.go:117] "RemoveContainer" containerID="21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e" Feb 23 07:02:14 crc kubenswrapper[5047]: E0223 07:02:14.202429 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e\": container with ID starting with 21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e not found: ID does not exist" containerID="21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.202479 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e"} err="failed to get container status \"21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e\": rpc error: code = NotFound desc = could not find container \"21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e\": container with ID starting with 21ae0486d50d378dbcff9f297b77805d0b245f4278242ef296d58694bad0da1e not found: ID does not exist" Feb 23 07:02:14 crc kubenswrapper[5047]: I0223 07:02:14.350393 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" path="/var/lib/kubelet/pods/53e16170-88d2-48dc-9d81-d9aa9c768655/volumes" Feb 23 07:02:15 crc kubenswrapper[5047]: I0223 07:02:15.061121 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerID="a519be80e27861ae24d1d9d44fbf661c5b62e544025f6415a465757f51782ad4" exitCode=0 Feb 23 07:02:15 crc kubenswrapper[5047]: I0223 07:02:15.061230 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" event={"ID":"eb5db10f-166f-4c03-b8f9-a8d549c48948","Type":"ContainerDied","Data":"a519be80e27861ae24d1d9d44fbf661c5b62e544025f6415a465757f51782ad4"} Feb 23 07:02:15 crc kubenswrapper[5047]: I0223 07:02:15.932652 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-9sdh2" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.401056 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.415257 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bhglb" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.481576 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chspj\" (UniqueName: \"kubernetes.io/projected/eb5db10f-166f-4c03-b8f9-a8d549c48948-kube-api-access-chspj\") pod \"eb5db10f-166f-4c03-b8f9-a8d549c48948\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.481650 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-bundle\") pod \"eb5db10f-166f-4c03-b8f9-a8d549c48948\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.481761 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-util\") pod \"eb5db10f-166f-4c03-b8f9-a8d549c48948\" (UID: \"eb5db10f-166f-4c03-b8f9-a8d549c48948\") " Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.483139 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-bundle" (OuterVolumeSpecName: "bundle") pod "eb5db10f-166f-4c03-b8f9-a8d549c48948" (UID: "eb5db10f-166f-4c03-b8f9-a8d549c48948"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.488428 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb5db10f-166f-4c03-b8f9-a8d549c48948-kube-api-access-chspj" (OuterVolumeSpecName: "kube-api-access-chspj") pod "eb5db10f-166f-4c03-b8f9-a8d549c48948" (UID: "eb5db10f-166f-4c03-b8f9-a8d549c48948"). InnerVolumeSpecName "kube-api-access-chspj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.498982 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-util" (OuterVolumeSpecName: "util") pod "eb5db10f-166f-4c03-b8f9-a8d549c48948" (UID: "eb5db10f-166f-4c03-b8f9-a8d549c48948"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.584387 5047 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-util\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.584443 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chspj\" (UniqueName: \"kubernetes.io/projected/eb5db10f-166f-4c03-b8f9-a8d549c48948-kube-api-access-chspj\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.584460 5047 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5db10f-166f-4c03-b8f9-a8d549c48948-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.759585 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:02:16 crc kubenswrapper[5047]: I0223 07:02:16.759723 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:02:17 crc kubenswrapper[5047]: I0223 07:02:17.084033 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" event={"ID":"eb5db10f-166f-4c03-b8f9-a8d549c48948","Type":"ContainerDied","Data":"aa1bb675989e81ed6091f4633f416d62678adf25c3af309ab46ce0c96f39bc16"} Feb 23 07:02:17 crc kubenswrapper[5047]: I0223 07:02:17.084085 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa1bb675989e81ed6091f4633f416d62678adf25c3af309ab46ce0c96f39bc16" Feb 23 07:02:17 crc kubenswrapper[5047]: I0223 07:02:17.084159 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2" Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.395634 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"] Feb 23 07:02:20 crc kubenswrapper[5047]: E0223 07:02:20.396867 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="extract-utilities" Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.396887 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="extract-utilities" Feb 23 07:02:20 crc kubenswrapper[5047]: E0223 07:02:20.396900 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerName="util" Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.396929 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerName="util" Feb 23 07:02:20 crc kubenswrapper[5047]: E0223 07:02:20.396951 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="registry-server" Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.396959 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="registry-server" Feb 23 07:02:20 crc kubenswrapper[5047]: E0223 07:02:20.396968 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerName="extract" Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.396976 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerName="extract" Feb 23 07:02:20 crc kubenswrapper[5047]: E0223 07:02:20.396989 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerName="pull"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.396997 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerName="pull"
Feb 23 07:02:20 crc kubenswrapper[5047]: E0223 07:02:20.397018 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="extract-content"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.397026 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="extract-content"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.397196 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5db10f-166f-4c03-b8f9-a8d549c48948" containerName="extract"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.397220 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e16170-88d2-48dc-9d81-d9aa9c768655" containerName="registry-server"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.397995 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.403496 5047 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-s96xj"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.403640 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"]
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.403725 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.403757 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.541666 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmfv5\" (UniqueName: \"kubernetes.io/projected/1c5fcdc3-a2cb-499f-9482-5e135180f65c-kube-api-access-gmfv5\") pod \"cert-manager-operator-controller-manager-66c8bdd694-f94tc\" (UID: \"1c5fcdc3-a2cb-499f-9482-5e135180f65c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.541783 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1c5fcdc3-a2cb-499f-9482-5e135180f65c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-f94tc\" (UID: \"1c5fcdc3-a2cb-499f-9482-5e135180f65c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.643346 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1c5fcdc3-a2cb-499f-9482-5e135180f65c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-f94tc\" (UID: \"1c5fcdc3-a2cb-499f-9482-5e135180f65c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.643543 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmfv5\" (UniqueName: \"kubernetes.io/projected/1c5fcdc3-a2cb-499f-9482-5e135180f65c-kube-api-access-gmfv5\") pod \"cert-manager-operator-controller-manager-66c8bdd694-f94tc\" (UID: \"1c5fcdc3-a2cb-499f-9482-5e135180f65c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.644104 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1c5fcdc3-a2cb-499f-9482-5e135180f65c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-f94tc\" (UID: \"1c5fcdc3-a2cb-499f-9482-5e135180f65c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.667060 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmfv5\" (UniqueName: \"kubernetes.io/projected/1c5fcdc3-a2cb-499f-9482-5e135180f65c-kube-api-access-gmfv5\") pod \"cert-manager-operator-controller-manager-66c8bdd694-f94tc\" (UID: \"1c5fcdc3-a2cb-499f-9482-5e135180f65c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:20 crc kubenswrapper[5047]: I0223 07:02:20.738092 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"
Feb 23 07:02:21 crc kubenswrapper[5047]: I0223 07:02:21.032013 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc"]
Feb 23 07:02:21 crc kubenswrapper[5047]: W0223 07:02:21.040537 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c5fcdc3_a2cb_499f_9482_5e135180f65c.slice/crio-80fa105cc8a0a13c1226856d2dd4a682732ec297815bcc44c149d470e9580620 WatchSource:0}: Error finding container 80fa105cc8a0a13c1226856d2dd4a682732ec297815bcc44c149d470e9580620: Status 404 returned error can't find the container with id 80fa105cc8a0a13c1226856d2dd4a682732ec297815bcc44c149d470e9580620
Feb 23 07:02:21 crc kubenswrapper[5047]: I0223 07:02:21.110484 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc" event={"ID":"1c5fcdc3-a2cb-499f-9482-5e135180f65c","Type":"ContainerStarted","Data":"80fa105cc8a0a13c1226856d2dd4a682732ec297815bcc44c149d470e9580620"}
Feb 23 07:02:24 crc kubenswrapper[5047]: I0223 07:02:24.133822 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc" event={"ID":"1c5fcdc3-a2cb-499f-9482-5e135180f65c","Type":"ContainerStarted","Data":"d9cbcbce6b46ad510d76785bf94f0c6d55ecc609fff5f509359c02aa4b6681ed"}
Feb 23 07:02:24 crc kubenswrapper[5047]: I0223 07:02:24.157038 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-f94tc" podStartSLOduration=1.482721512 podStartE2EDuration="4.157014272s" podCreationTimestamp="2026-02-23 07:02:20 +0000 UTC" firstStartedPulling="2026-02-23 07:02:21.043962687 +0000 UTC m=+1063.295289821" lastFinishedPulling="2026-02-23 07:02:23.718255437 +0000 UTC m=+1065.969582581" observedRunningTime="2026-02-23 07:02:24.155757388 +0000 UTC m=+1066.407084522" watchObservedRunningTime="2026-02-23 07:02:24.157014272 +0000 UTC m=+1066.408341406"
Feb 23 07:02:25 crc kubenswrapper[5047]: I0223 07:02:25.841497 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-nh5jn"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.414126 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-hrqps"]
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.415680 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.418046 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.418687 5047 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dmn22"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.419973 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.432978 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-hrqps"]
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.502326 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20a7cb7d-0122-4d8b-b98c-ed082d5fb596-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-hrqps\" (UID: \"20a7cb7d-0122-4d8b-b98c-ed082d5fb596\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.502653 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkwcq\" (UniqueName: \"kubernetes.io/projected/20a7cb7d-0122-4d8b-b98c-ed082d5fb596-kube-api-access-qkwcq\") pod \"cert-manager-cainjector-5545bd876-hrqps\" (UID: \"20a7cb7d-0122-4d8b-b98c-ed082d5fb596\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.604310 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20a7cb7d-0122-4d8b-b98c-ed082d5fb596-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-hrqps\" (UID: \"20a7cb7d-0122-4d8b-b98c-ed082d5fb596\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.604435 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkwcq\" (UniqueName: \"kubernetes.io/projected/20a7cb7d-0122-4d8b-b98c-ed082d5fb596-kube-api-access-qkwcq\") pod \"cert-manager-cainjector-5545bd876-hrqps\" (UID: \"20a7cb7d-0122-4d8b-b98c-ed082d5fb596\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.627790 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20a7cb7d-0122-4d8b-b98c-ed082d5fb596-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-hrqps\" (UID: \"20a7cb7d-0122-4d8b-b98c-ed082d5fb596\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.627824 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkwcq\" (UniqueName: \"kubernetes.io/projected/20a7cb7d-0122-4d8b-b98c-ed082d5fb596-kube-api-access-qkwcq\") pod \"cert-manager-cainjector-5545bd876-hrqps\" (UID: \"20a7cb7d-0122-4d8b-b98c-ed082d5fb596\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:30 crc kubenswrapper[5047]: I0223 07:02:30.741326 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps"
Feb 23 07:02:31 crc kubenswrapper[5047]: I0223 07:02:31.185867 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-hrqps"]
Feb 23 07:02:32 crc kubenswrapper[5047]: I0223 07:02:32.191932 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps" event={"ID":"20a7cb7d-0122-4d8b-b98c-ed082d5fb596","Type":"ContainerStarted","Data":"bff62279dd551beb4b1f21b51c7228e64e2297a8e443834e7d1f58132ebfd437"}
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.019483 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rbg92"]
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.020290 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.024546 5047 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4vpdk"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.042371 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rbg92"]
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.145314 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpnmf\" (UniqueName: \"kubernetes.io/projected/7ccbccf7-5572-4158-b181-e23c3ef00a05-kube-api-access-xpnmf\") pod \"cert-manager-webhook-6888856db4-rbg92\" (UID: \"7ccbccf7-5572-4158-b181-e23c3ef00a05\") " pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.145425 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ccbccf7-5572-4158-b181-e23c3ef00a05-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rbg92\" (UID: \"7ccbccf7-5572-4158-b181-e23c3ef00a05\") " pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.246787 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpnmf\" (UniqueName: \"kubernetes.io/projected/7ccbccf7-5572-4158-b181-e23c3ef00a05-kube-api-access-xpnmf\") pod \"cert-manager-webhook-6888856db4-rbg92\" (UID: \"7ccbccf7-5572-4158-b181-e23c3ef00a05\") " pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.246937 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ccbccf7-5572-4158-b181-e23c3ef00a05-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rbg92\" (UID: \"7ccbccf7-5572-4158-b181-e23c3ef00a05\") " pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.284683 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ccbccf7-5572-4158-b181-e23c3ef00a05-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rbg92\" (UID: \"7ccbccf7-5572-4158-b181-e23c3ef00a05\") " pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.285765 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpnmf\" (UniqueName: \"kubernetes.io/projected/7ccbccf7-5572-4158-b181-e23c3ef00a05-kube-api-access-xpnmf\") pod \"cert-manager-webhook-6888856db4-rbg92\" (UID: \"7ccbccf7-5572-4158-b181-e23c3ef00a05\") " pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.358135 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:33 crc kubenswrapper[5047]: I0223 07:02:33.623918 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rbg92"]
Feb 23 07:02:34 crc kubenswrapper[5047]: I0223 07:02:34.215772 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-rbg92" event={"ID":"7ccbccf7-5572-4158-b181-e23c3ef00a05","Type":"ContainerStarted","Data":"8d7b7739439f8486cdff63ea43a6b2ab861a058323d096733cac731af28df4a5"}
Feb 23 07:02:36 crc kubenswrapper[5047]: I0223 07:02:36.232107 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-rbg92" event={"ID":"7ccbccf7-5572-4158-b181-e23c3ef00a05","Type":"ContainerStarted","Data":"333be86e43e8d93810e7d1ad156ad35776301448510090c8ba51dcc49e07fced"}
Feb 23 07:02:36 crc kubenswrapper[5047]: I0223 07:02:36.232393 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:36 crc kubenswrapper[5047]: I0223 07:02:36.235671 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps" event={"ID":"20a7cb7d-0122-4d8b-b98c-ed082d5fb596","Type":"ContainerStarted","Data":"85c755b39f1b6a3e4cb4244f930d0c940db007569b492e3f0f22046d489be7b3"}
Feb 23 07:02:36 crc kubenswrapper[5047]: I0223 07:02:36.259084 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-rbg92" podStartSLOduration=2.313347137 podStartE2EDuration="4.259051592s" podCreationTimestamp="2026-02-23 07:02:32 +0000 UTC" firstStartedPulling="2026-02-23 07:02:33.637297707 +0000 UTC m=+1075.888624841" lastFinishedPulling="2026-02-23 07:02:35.583002162 +0000 UTC m=+1077.834329296" observedRunningTime="2026-02-23 07:02:36.250415105 +0000 UTC m=+1078.501742319" watchObservedRunningTime="2026-02-23 07:02:36.259051592 +0000 UTC m=+1078.510378776"
Feb 23 07:02:36 crc kubenswrapper[5047]: I0223 07:02:36.281747 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-hrqps" podStartSLOduration=1.892967091 podStartE2EDuration="6.281706645s" podCreationTimestamp="2026-02-23 07:02:30 +0000 UTC" firstStartedPulling="2026-02-23 07:02:31.19507699 +0000 UTC m=+1073.446404134" lastFinishedPulling="2026-02-23 07:02:35.583816554 +0000 UTC m=+1077.835143688" observedRunningTime="2026-02-23 07:02:36.277626123 +0000 UTC m=+1078.528953297" watchObservedRunningTime="2026-02-23 07:02:36.281706645 +0000 UTC m=+1078.533033819"
Feb 23 07:02:37 crc kubenswrapper[5047]: I0223 07:02:37.974344 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-8rv27"]
Feb 23 07:02:37 crc kubenswrapper[5047]: I0223 07:02:37.975468 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:37 crc kubenswrapper[5047]: I0223 07:02:37.981019 5047 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zzkhx"
Feb 23 07:02:37 crc kubenswrapper[5047]: I0223 07:02:37.999117 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8rv27"]
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.133074 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdjr\" (UniqueName: \"kubernetes.io/projected/8f24c3b5-950d-40ec-a7cf-e71d942a57aa-kube-api-access-tmdjr\") pod \"cert-manager-545d4d4674-8rv27\" (UID: \"8f24c3b5-950d-40ec-a7cf-e71d942a57aa\") " pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.133150 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f24c3b5-950d-40ec-a7cf-e71d942a57aa-bound-sa-token\") pod \"cert-manager-545d4d4674-8rv27\" (UID: \"8f24c3b5-950d-40ec-a7cf-e71d942a57aa\") " pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.235950 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdjr\" (UniqueName: \"kubernetes.io/projected/8f24c3b5-950d-40ec-a7cf-e71d942a57aa-kube-api-access-tmdjr\") pod \"cert-manager-545d4d4674-8rv27\" (UID: \"8f24c3b5-950d-40ec-a7cf-e71d942a57aa\") " pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.236052 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f24c3b5-950d-40ec-a7cf-e71d942a57aa-bound-sa-token\") pod \"cert-manager-545d4d4674-8rv27\" (UID: \"8f24c3b5-950d-40ec-a7cf-e71d942a57aa\") " pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.263874 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdjr\" (UniqueName: \"kubernetes.io/projected/8f24c3b5-950d-40ec-a7cf-e71d942a57aa-kube-api-access-tmdjr\") pod \"cert-manager-545d4d4674-8rv27\" (UID: \"8f24c3b5-950d-40ec-a7cf-e71d942a57aa\") " pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.264379 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f24c3b5-950d-40ec-a7cf-e71d942a57aa-bound-sa-token\") pod \"cert-manager-545d4d4674-8rv27\" (UID: \"8f24c3b5-950d-40ec-a7cf-e71d942a57aa\") " pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.296475 5047 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zzkhx"
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.304152 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8rv27"
Feb 23 07:02:38 crc kubenswrapper[5047]: W0223 07:02:38.574494 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f24c3b5_950d_40ec_a7cf_e71d942a57aa.slice/crio-4019077f071eddd9c0f1f9716566f4ce34d4a32f640a2b22f9cfd59b46794914 WatchSource:0}: Error finding container 4019077f071eddd9c0f1f9716566f4ce34d4a32f640a2b22f9cfd59b46794914: Status 404 returned error can't find the container with id 4019077f071eddd9c0f1f9716566f4ce34d4a32f640a2b22f9cfd59b46794914
Feb 23 07:02:38 crc kubenswrapper[5047]: I0223 07:02:38.578710 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8rv27"]
Feb 23 07:02:39 crc kubenswrapper[5047]: I0223 07:02:39.275670 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8rv27" event={"ID":"8f24c3b5-950d-40ec-a7cf-e71d942a57aa","Type":"ContainerStarted","Data":"76615442d68417574568af73e10994babd3b7f495267ebf04d0233a3e047759a"}
Feb 23 07:02:39 crc kubenswrapper[5047]: I0223 07:02:39.277837 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8rv27" event={"ID":"8f24c3b5-950d-40ec-a7cf-e71d942a57aa","Type":"ContainerStarted","Data":"4019077f071eddd9c0f1f9716566f4ce34d4a32f640a2b22f9cfd59b46794914"}
Feb 23 07:02:39 crc kubenswrapper[5047]: I0223 07:02:39.310254 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-8rv27" podStartSLOduration=2.310209945 podStartE2EDuration="2.310209945s" podCreationTimestamp="2026-02-23 07:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:39.300401926 +0000 UTC m=+1081.551729140" watchObservedRunningTime="2026-02-23 07:02:39.310209945 +0000 UTC m=+1081.561537119"
Feb 23 07:02:43 crc kubenswrapper[5047]: I0223 07:02:43.363613 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-rbg92"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.444482 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5875c"]
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.446197 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5875c"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.450134 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.450177 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.450523 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-qwft2"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.475745 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5875c"]
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.584734 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxbsl\" (UniqueName: \"kubernetes.io/projected/c4003a37-a564-4e3a-9851-43474a64daee-kube-api-access-rxbsl\") pod \"openstack-operator-index-5875c\" (UID: \"c4003a37-a564-4e3a-9851-43474a64daee\") " pod="openstack-operators/openstack-operator-index-5875c"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.686625 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxbsl\" (UniqueName: \"kubernetes.io/projected/c4003a37-a564-4e3a-9851-43474a64daee-kube-api-access-rxbsl\") pod \"openstack-operator-index-5875c\" (UID: \"c4003a37-a564-4e3a-9851-43474a64daee\") " pod="openstack-operators/openstack-operator-index-5875c"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.711591 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxbsl\" (UniqueName: \"kubernetes.io/projected/c4003a37-a564-4e3a-9851-43474a64daee-kube-api-access-rxbsl\") pod \"openstack-operator-index-5875c\" (UID: \"c4003a37-a564-4e3a-9851-43474a64daee\") " pod="openstack-operators/openstack-operator-index-5875c"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.760297 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.760390 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:02:46 crc kubenswrapper[5047]: I0223 07:02:46.778241 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5875c"
Feb 23 07:02:47 crc kubenswrapper[5047]: I0223 07:02:47.263260 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5875c"]
Feb 23 07:02:47 crc kubenswrapper[5047]: I0223 07:02:47.356258 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5875c" event={"ID":"c4003a37-a564-4e3a-9851-43474a64daee","Type":"ContainerStarted","Data":"a97b500f72f95c7dc3fda0e4cd98b493a7dbead4b2f96db4949ec77259922233"}
Feb 23 07:02:48 crc kubenswrapper[5047]: I0223 07:02:48.365135 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5875c" event={"ID":"c4003a37-a564-4e3a-9851-43474a64daee","Type":"ContainerStarted","Data":"6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05"}
Feb 23 07:02:48 crc kubenswrapper[5047]: I0223 07:02:48.388304 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5875c" podStartSLOduration=1.63362489 podStartE2EDuration="2.388280701s" podCreationTimestamp="2026-02-23 07:02:46 +0000 UTC" firstStartedPulling="2026-02-23 07:02:47.272091808 +0000 UTC m=+1089.523418942" lastFinishedPulling="2026-02-23 07:02:48.026747609 +0000 UTC m=+1090.278074753" observedRunningTime="2026-02-23 07:02:48.382562104 +0000 UTC m=+1090.633889248" watchObservedRunningTime="2026-02-23 07:02:48.388280701 +0000 UTC m=+1090.639607835"
Feb 23 07:02:49 crc kubenswrapper[5047]: I0223 07:02:49.817116 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5875c"]
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.378160 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5875c" podUID="c4003a37-a564-4e3a-9851-43474a64daee" containerName="registry-server" containerID="cri-o://6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05" gracePeriod=2
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.430763 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cxhzm"]
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.431726 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.438615 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cxhzm"]
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.559360 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whwxc\" (UniqueName: \"kubernetes.io/projected/59e0a106-e284-4715-b287-b77124d0fc64-kube-api-access-whwxc\") pod \"openstack-operator-index-cxhzm\" (UID: \"59e0a106-e284-4715-b287-b77124d0fc64\") " pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.668468 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whwxc\" (UniqueName: \"kubernetes.io/projected/59e0a106-e284-4715-b287-b77124d0fc64-kube-api-access-whwxc\") pod \"openstack-operator-index-cxhzm\" (UID: \"59e0a106-e284-4715-b287-b77124d0fc64\") " pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.696813 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whwxc\" (UniqueName: \"kubernetes.io/projected/59e0a106-e284-4715-b287-b77124d0fc64-kube-api-access-whwxc\") pod \"openstack-operator-index-cxhzm\" (UID: \"59e0a106-e284-4715-b287-b77124d0fc64\") " pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.771312 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.863328 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5875c"
Feb 23 07:02:50 crc kubenswrapper[5047]: I0223 07:02:50.986845 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxbsl\" (UniqueName: \"kubernetes.io/projected/c4003a37-a564-4e3a-9851-43474a64daee-kube-api-access-rxbsl\") pod \"c4003a37-a564-4e3a-9851-43474a64daee\" (UID: \"c4003a37-a564-4e3a-9851-43474a64daee\") "
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.000284 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4003a37-a564-4e3a-9851-43474a64daee-kube-api-access-rxbsl" (OuterVolumeSpecName: "kube-api-access-rxbsl") pod "c4003a37-a564-4e3a-9851-43474a64daee" (UID: "c4003a37-a564-4e3a-9851-43474a64daee"). InnerVolumeSpecName "kube-api-access-rxbsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.089080 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxbsl\" (UniqueName: \"kubernetes.io/projected/c4003a37-a564-4e3a-9851-43474a64daee-kube-api-access-rxbsl\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.265605 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cxhzm"]
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.389738 5047 generic.go:334] "Generic (PLEG): container finished" podID="c4003a37-a564-4e3a-9851-43474a64daee" containerID="6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05" exitCode=0
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.389885 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5875c" event={"ID":"c4003a37-a564-4e3a-9851-43474a64daee","Type":"ContainerDied","Data":"6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05"}
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.390010 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5875c" event={"ID":"c4003a37-a564-4e3a-9851-43474a64daee","Type":"ContainerDied","Data":"a97b500f72f95c7dc3fda0e4cd98b493a7dbead4b2f96db4949ec77259922233"}
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.390087 5047 scope.go:117] "RemoveContainer" containerID="6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05"
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.390529 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5875c"
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.391735 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cxhzm" event={"ID":"59e0a106-e284-4715-b287-b77124d0fc64","Type":"ContainerStarted","Data":"5e9a858aad2ef8a0990285d8fdd73bb00dfaed35403e8dcd3cf7a881d5fc6de0"}
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.450391 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5875c"]
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.459241 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5875c"]
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.492232 5047 scope.go:117] "RemoveContainer" containerID="6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05"
Feb 23 07:02:51 crc kubenswrapper[5047]: E0223 07:02:51.499122 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05\": container with ID starting with 6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05 not found: ID does not exist" containerID="6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05"
Feb 23 07:02:51 crc kubenswrapper[5047]: I0223 07:02:51.499195 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05"} err="failed to get container status \"6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05\": rpc error: code = NotFound desc = could not find container \"6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05\": container with ID starting with 6510046a32f3b39d93a25b8a3264ff64fa59fd1b15e153b6b4498cec15f07b05 not found: ID does not exist"
Feb 23 07:02:52 crc kubenswrapper[5047]: I0223 07:02:52.354712 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4003a37-a564-4e3a-9851-43474a64daee" path="/var/lib/kubelet/pods/c4003a37-a564-4e3a-9851-43474a64daee/volumes"
Feb 23 07:02:52 crc kubenswrapper[5047]: I0223 07:02:52.419672 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cxhzm" event={"ID":"59e0a106-e284-4715-b287-b77124d0fc64","Type":"ContainerStarted","Data":"dce2a156b13dc243ebb626b539e662bd8849e843970e9a5d4efdb12ddc7b20b9"}
Feb 23 07:02:52 crc kubenswrapper[5047]: I0223 07:02:52.436124 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cxhzm" podStartSLOduration=1.906757205 podStartE2EDuration="2.436097461s" podCreationTimestamp="2026-02-23 07:02:50 +0000 UTC" firstStartedPulling="2026-02-23 07:02:51.273980934 +0000 UTC m=+1093.525308108" lastFinishedPulling="2026-02-23 07:02:51.80332123 +0000 UTC m=+1094.054648364" observedRunningTime="2026-02-23 07:02:52.435233927 +0000 UTC m=+1094.686561081" watchObservedRunningTime="2026-02-23 07:02:52.436097461 +0000 UTC m=+1094.687424605"
Feb 23 07:03:00 crc kubenswrapper[5047]: I0223 07:03:00.771990 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:03:00 crc kubenswrapper[5047]: I0223 07:03:00.772432 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:03:00 crc kubenswrapper[5047]: I0223 07:03:00.815233 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:03:01 crc kubenswrapper[5047]: I0223 07:03:01.544916 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cxhzm"
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.737273 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7"]
Feb 23 07:03:08 crc kubenswrapper[5047]: E0223 07:03:08.738785 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4003a37-a564-4e3a-9851-43474a64daee" containerName="registry-server"
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.738805 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4003a37-a564-4e3a-9851-43474a64daee" containerName="registry-server"
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.752003 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4003a37-a564-4e3a-9851-43474a64daee" containerName="registry-server"
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.753981 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7"
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.757671 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9jxcn"
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.784066 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7"]
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.901811 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7"
Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.901996 5047 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nknv8\" (UniqueName: \"kubernetes.io/projected/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-kube-api-access-nknv8\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:08 crc kubenswrapper[5047]: I0223 07:03:08.902292 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.005856 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.005953 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nknv8\" (UniqueName: \"kubernetes.io/projected/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-kube-api-access-nknv8\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.006034 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.006562 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.006704 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.030162 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nknv8\" (UniqueName: \"kubernetes.io/projected/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-kube-api-access-nknv8\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.088660 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:09 crc kubenswrapper[5047]: I0223 07:03:09.616148 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7"] Feb 23 07:03:09 crc kubenswrapper[5047]: W0223 07:03:09.623458 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67ccc6d_2058_4aa1_b91f_db542ab3ff96.slice/crio-59650d25218a68a63ef528fe53a5095dc07627571500d321146e09759fe0ba24 WatchSource:0}: Error finding container 59650d25218a68a63ef528fe53a5095dc07627571500d321146e09759fe0ba24: Status 404 returned error can't find the container with id 59650d25218a68a63ef528fe53a5095dc07627571500d321146e09759fe0ba24 Feb 23 07:03:10 crc kubenswrapper[5047]: I0223 07:03:10.585276 5047 generic.go:334] "Generic (PLEG): container finished" podID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerID="20b5705354e4ea74462c2ccd0f68cb79099751afb48c141d5b9304a4e6e36a72" exitCode=0 Feb 23 07:03:10 crc kubenswrapper[5047]: I0223 07:03:10.585418 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" event={"ID":"c67ccc6d-2058-4aa1-b91f-db542ab3ff96","Type":"ContainerDied","Data":"20b5705354e4ea74462c2ccd0f68cb79099751afb48c141d5b9304a4e6e36a72"} Feb 23 07:03:10 crc kubenswrapper[5047]: I0223 07:03:10.585940 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" event={"ID":"c67ccc6d-2058-4aa1-b91f-db542ab3ff96","Type":"ContainerStarted","Data":"59650d25218a68a63ef528fe53a5095dc07627571500d321146e09759fe0ba24"} Feb 23 07:03:11 crc kubenswrapper[5047]: I0223 07:03:11.599253 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerID="a03c54ae2d541bf0ce4952cd15ccfcec095e403f1c9ba2115e36bc1de33170bc" exitCode=0 Feb 23 07:03:11 crc kubenswrapper[5047]: I0223 07:03:11.599523 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" event={"ID":"c67ccc6d-2058-4aa1-b91f-db542ab3ff96","Type":"ContainerDied","Data":"a03c54ae2d541bf0ce4952cd15ccfcec095e403f1c9ba2115e36bc1de33170bc"} Feb 23 07:03:12 crc kubenswrapper[5047]: I0223 07:03:12.614888 5047 generic.go:334] "Generic (PLEG): container finished" podID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerID="bab338cebc99b42ba386883b6640167f71dc3f46a17f55259133bbce33d20739" exitCode=0 Feb 23 07:03:12 crc kubenswrapper[5047]: I0223 07:03:12.614984 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" event={"ID":"c67ccc6d-2058-4aa1-b91f-db542ab3ff96","Type":"ContainerDied","Data":"bab338cebc99b42ba386883b6640167f71dc3f46a17f55259133bbce33d20739"} Feb 23 07:03:13 crc kubenswrapper[5047]: I0223 07:03:13.895884 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:13 crc kubenswrapper[5047]: I0223 07:03:13.997395 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-bundle\") pod \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " Feb 23 07:03:13 crc kubenswrapper[5047]: I0223 07:03:13.997494 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-util\") pod \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " Feb 23 07:03:13 crc kubenswrapper[5047]: I0223 07:03:13.997582 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nknv8\" (UniqueName: \"kubernetes.io/projected/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-kube-api-access-nknv8\") pod \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\" (UID: \"c67ccc6d-2058-4aa1-b91f-db542ab3ff96\") " Feb 23 07:03:13 crc kubenswrapper[5047]: I0223 07:03:13.998614 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-bundle" (OuterVolumeSpecName: "bundle") pod "c67ccc6d-2058-4aa1-b91f-db542ab3ff96" (UID: "c67ccc6d-2058-4aa1-b91f-db542ab3ff96"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.007635 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-kube-api-access-nknv8" (OuterVolumeSpecName: "kube-api-access-nknv8") pod "c67ccc6d-2058-4aa1-b91f-db542ab3ff96" (UID: "c67ccc6d-2058-4aa1-b91f-db542ab3ff96"). InnerVolumeSpecName "kube-api-access-nknv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.012700 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-util" (OuterVolumeSpecName: "util") pod "c67ccc6d-2058-4aa1-b91f-db542ab3ff96" (UID: "c67ccc6d-2058-4aa1-b91f-db542ab3ff96"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.099358 5047 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.099397 5047 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-util\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.099408 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nknv8\" (UniqueName: \"kubernetes.io/projected/c67ccc6d-2058-4aa1-b91f-db542ab3ff96-kube-api-access-nknv8\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.637646 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" event={"ID":"c67ccc6d-2058-4aa1-b91f-db542ab3ff96","Type":"ContainerDied","Data":"59650d25218a68a63ef528fe53a5095dc07627571500d321146e09759fe0ba24"} Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.637742 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59650d25218a68a63ef528fe53a5095dc07627571500d321146e09759fe0ba24" Feb 23 07:03:14 crc kubenswrapper[5047]: I0223 07:03:14.637884 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7" Feb 23 07:03:16 crc kubenswrapper[5047]: I0223 07:03:16.759675 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:03:16 crc kubenswrapper[5047]: I0223 07:03:16.760315 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:03:16 crc kubenswrapper[5047]: I0223 07:03:16.760403 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:03:16 crc kubenswrapper[5047]: I0223 07:03:16.761518 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3620b5bb5ccf16efd5aeafb9b69b3b4d8f6c25db9ca9c3c473e293ae127f043c"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:03:16 crc kubenswrapper[5047]: I0223 07:03:16.761611 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://3620b5bb5ccf16efd5aeafb9b69b3b4d8f6c25db9ca9c3c473e293ae127f043c" gracePeriod=600 Feb 23 07:03:17 crc kubenswrapper[5047]: I0223 07:03:17.670195 5047 generic.go:334] "Generic (PLEG): 
container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="3620b5bb5ccf16efd5aeafb9b69b3b4d8f6c25db9ca9c3c473e293ae127f043c" exitCode=0 Feb 23 07:03:17 crc kubenswrapper[5047]: I0223 07:03:17.670299 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"3620b5bb5ccf16efd5aeafb9b69b3b4d8f6c25db9ca9c3c473e293ae127f043c"} Feb 23 07:03:17 crc kubenswrapper[5047]: I0223 07:03:17.670900 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"18683d8e116e648be702e8fe2b59331eb73a682d795d009e915974255f56b210"} Feb 23 07:03:17 crc kubenswrapper[5047]: I0223 07:03:17.670974 5047 scope.go:117] "RemoveContainer" containerID="bf0d472a1be69e8faaec3de3ca52b467dc0e01107e59ea30330ccf2c4ba1980f" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.814550 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx"] Feb 23 07:03:20 crc kubenswrapper[5047]: E0223 07:03:20.816656 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerName="util" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.816750 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerName="util" Feb 23 07:03:20 crc kubenswrapper[5047]: E0223 07:03:20.816810 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerName="extract" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.816864 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerName="extract" Feb 23 07:03:20 crc kubenswrapper[5047]: 
E0223 07:03:20.816976 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerName="pull" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.817040 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerName="pull" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.817208 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67ccc6d-2058-4aa1-b91f-db542ab3ff96" containerName="extract" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.817750 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.820001 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-j8fnb" Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.851118 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx"] Feb 23 07:03:20 crc kubenswrapper[5047]: I0223 07:03:20.946463 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvsd7\" (UniqueName: \"kubernetes.io/projected/f5b3049a-c454-4b3a-bc73-7529f164bcf1-kube-api-access-vvsd7\") pod \"openstack-operator-controller-init-6679bf9b57-68nbx\" (UID: \"f5b3049a-c454-4b3a-bc73-7529f164bcf1\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" Feb 23 07:03:21 crc kubenswrapper[5047]: I0223 07:03:21.048341 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvsd7\" (UniqueName: \"kubernetes.io/projected/f5b3049a-c454-4b3a-bc73-7529f164bcf1-kube-api-access-vvsd7\") pod \"openstack-operator-controller-init-6679bf9b57-68nbx\" (UID: 
\"f5b3049a-c454-4b3a-bc73-7529f164bcf1\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" Feb 23 07:03:21 crc kubenswrapper[5047]: I0223 07:03:21.072469 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvsd7\" (UniqueName: \"kubernetes.io/projected/f5b3049a-c454-4b3a-bc73-7529f164bcf1-kube-api-access-vvsd7\") pod \"openstack-operator-controller-init-6679bf9b57-68nbx\" (UID: \"f5b3049a-c454-4b3a-bc73-7529f164bcf1\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" Feb 23 07:03:21 crc kubenswrapper[5047]: I0223 07:03:21.139485 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" Feb 23 07:03:21 crc kubenswrapper[5047]: I0223 07:03:21.445744 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx"] Feb 23 07:03:21 crc kubenswrapper[5047]: W0223 07:03:21.451887 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b3049a_c454_4b3a_bc73_7529f164bcf1.slice/crio-0f0a78d6d96bc39554c4a5bd8687c8c59cef8ea726b70c61f9f517947917bdca WatchSource:0}: Error finding container 0f0a78d6d96bc39554c4a5bd8687c8c59cef8ea726b70c61f9f517947917bdca: Status 404 returned error can't find the container with id 0f0a78d6d96bc39554c4a5bd8687c8c59cef8ea726b70c61f9f517947917bdca Feb 23 07:03:21 crc kubenswrapper[5047]: I0223 07:03:21.701973 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" event={"ID":"f5b3049a-c454-4b3a-bc73-7529f164bcf1","Type":"ContainerStarted","Data":"0f0a78d6d96bc39554c4a5bd8687c8c59cef8ea726b70c61f9f517947917bdca"} Feb 23 07:03:26 crc kubenswrapper[5047]: I0223 07:03:26.750264 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" event={"ID":"f5b3049a-c454-4b3a-bc73-7529f164bcf1","Type":"ContainerStarted","Data":"89d33b2defdfcf4adc82d471f914e7bc5cf515fa786ba77397f68dc8963f1227"} Feb 23 07:03:26 crc kubenswrapper[5047]: I0223 07:03:26.750900 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" Feb 23 07:03:26 crc kubenswrapper[5047]: I0223 07:03:26.797091 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" podStartSLOduration=2.490481237 podStartE2EDuration="6.797062171s" podCreationTimestamp="2026-02-23 07:03:20 +0000 UTC" firstStartedPulling="2026-02-23 07:03:21.455228341 +0000 UTC m=+1123.706555475" lastFinishedPulling="2026-02-23 07:03:25.761809275 +0000 UTC m=+1128.013136409" observedRunningTime="2026-02-23 07:03:26.78005263 +0000 UTC m=+1129.031379804" watchObservedRunningTime="2026-02-23 07:03:26.797062171 +0000 UTC m=+1129.048389315" Feb 23 07:03:31 crc kubenswrapper[5047]: I0223 07:03:31.144104 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-68nbx" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.210248 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-7642d"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.211871 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.215692 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mmcgk" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.227283 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-7642d"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.276978 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.278353 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.293025 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-j92nl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.298321 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.305820 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.309860 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qkhrw" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.317007 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.358566 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjkd\" (UniqueName: \"kubernetes.io/projected/d5a20255-deb8-4cda-a48c-2735b1e66247-kube-api-access-9hjkd\") pod \"barbican-operator-controller-manager-868647ff47-7642d\" (UID: \"d5a20255-deb8-4cda-a48c-2735b1e66247\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.358612 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.362949 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.364036 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.369230 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-t8f2n" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.384321 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.385655 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.400620 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-n8l6z" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.400775 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.416156 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.431395 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.432461 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.435460 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-jpf4f" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.451160 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.452214 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.458002 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8b2tl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.458200 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.459839 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rg7v\" (UniqueName: \"kubernetes.io/projected/50b2c3a5-05f7-4f75-a03b-b79278119309-kube-api-access-9rg7v\") pod \"cinder-operator-controller-manager-5d946d989d-2cbb6\" (UID: \"50b2c3a5-05f7-4f75-a03b-b79278119309\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.459960 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm24v\" (UniqueName: \"kubernetes.io/projected/fd87e444-98c8-4de9-9a11-ec9678daaeaa-kube-api-access-fm24v\") pod \"designate-operator-controller-manager-6d8bf5c495-6sl7g\" (UID: \"fd87e444-98c8-4de9-9a11-ec9678daaeaa\") " 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.459993 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdq8c\" (UniqueName: \"kubernetes.io/projected/8dcb699b-69b9-4a9c-86ff-d04bf088e297-kube-api-access-jdq8c\") pod \"glance-operator-controller-manager-77987464f4-7gz7l\" (UID: \"8dcb699b-69b9-4a9c-86ff-d04bf088e297\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.460197 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hjkd\" (UniqueName: \"kubernetes.io/projected/d5a20255-deb8-4cda-a48c-2735b1e66247-kube-api-access-9hjkd\") pod \"barbican-operator-controller-manager-868647ff47-7642d\" (UID: \"d5a20255-deb8-4cda-a48c-2735b1e66247\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.463142 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.468494 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.475039 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.475976 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.479118 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rqnzw" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.490057 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.490426 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.491353 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.498687 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4bbb4" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.502447 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.510232 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjkd\" (UniqueName: \"kubernetes.io/projected/d5a20255-deb8-4cda-a48c-2735b1e66247-kube-api-access-9hjkd\") pod \"barbican-operator-controller-manager-868647ff47-7642d\" (UID: \"d5a20255-deb8-4cda-a48c-2735b1e66247\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.515788 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j"] Feb 23 07:04:08 crc 
kubenswrapper[5047]: I0223 07:04:08.516852 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.521507 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-2xgp6" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.524935 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.526062 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.532086 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-kbr9x" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.532594 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.533128 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.548022 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.549111 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.551090 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-684ng" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.551237 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.556192 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.557182 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.561433 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hvtqc" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.563048 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.564116 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.564136 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566135 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbcj\" (UniqueName: \"kubernetes.io/projected/9334599a-1686-4e49-b6e2-799c2038e0df-kube-api-access-jwbcj\") pod \"horizon-operator-controller-manager-5b9b8895d5-cqtpj\" (UID: \"9334599a-1686-4e49-b6e2-799c2038e0df\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566169 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566191 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg724\" (UniqueName: \"kubernetes.io/projected/b047da04-c7bc-47c3-aae0-71b9d98a650e-kube-api-access-qg724\") pod \"heat-operator-controller-manager-69f49c598c-pbjl2\" (UID: \"b047da04-c7bc-47c3-aae0-71b9d98a650e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566234 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rg7v\" (UniqueName: \"kubernetes.io/projected/50b2c3a5-05f7-4f75-a03b-b79278119309-kube-api-access-9rg7v\") pod \"cinder-operator-controller-manager-5d946d989d-2cbb6\" (UID: \"50b2c3a5-05f7-4f75-a03b-b79278119309\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" Feb 23 07:04:08 crc kubenswrapper[5047]: 
I0223 07:04:08.566257 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kdlj\" (UniqueName: \"kubernetes.io/projected/a263ae62-717d-4032-b1e2-042f1b3e936e-kube-api-access-2kdlj\") pod \"ironic-operator-controller-manager-554564d7fc-qr226\" (UID: \"a263ae62-717d-4032-b1e2-042f1b3e936e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566281 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm24v\" (UniqueName: \"kubernetes.io/projected/fd87e444-98c8-4de9-9a11-ec9678daaeaa-kube-api-access-fm24v\") pod \"designate-operator-controller-manager-6d8bf5c495-6sl7g\" (UID: \"fd87e444-98c8-4de9-9a11-ec9678daaeaa\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566297 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdq8c\" (UniqueName: \"kubernetes.io/projected/8dcb699b-69b9-4a9c-86ff-d04bf088e297-kube-api-access-jdq8c\") pod \"glance-operator-controller-manager-77987464f4-7gz7l\" (UID: \"8dcb699b-69b9-4a9c-86ff-d04bf088e297\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566325 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz784\" (UniqueName: \"kubernetes.io/projected/4f170913-8ccb-42e2-9113-23fb049373c9-kube-api-access-xz784\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.566352 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tv554\" (UniqueName: \"kubernetes.io/projected/c67d85ac-ca9b-4b6f-826d-c04dd6b6850b-kube-api-access-tv554\") pod \"keystone-operator-controller-manager-b4d948c87-wdx4l\" (UID: \"c67d85ac-ca9b-4b6f-826d-c04dd6b6850b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.571434 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.576886 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9zqkc" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.577086 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.602709 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.603638 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.605749 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rg7v\" (UniqueName: \"kubernetes.io/projected/50b2c3a5-05f7-4f75-a03b-b79278119309-kube-api-access-9rg7v\") pod \"cinder-operator-controller-manager-5d946d989d-2cbb6\" (UID: \"50b2c3a5-05f7-4f75-a03b-b79278119309\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.613166 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.613194 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5jwcg" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.621793 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdq8c\" (UniqueName: \"kubernetes.io/projected/8dcb699b-69b9-4a9c-86ff-d04bf088e297-kube-api-access-jdq8c\") pod \"glance-operator-controller-manager-77987464f4-7gz7l\" (UID: \"8dcb699b-69b9-4a9c-86ff-d04bf088e297\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.622581 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm24v\" (UniqueName: \"kubernetes.io/projected/fd87e444-98c8-4de9-9a11-ec9678daaeaa-kube-api-access-fm24v\") pod \"designate-operator-controller-manager-6d8bf5c495-6sl7g\" (UID: \"fd87e444-98c8-4de9-9a11-ec9678daaeaa\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.624273 5047 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.634637 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.638981 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.663845 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.668113 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-c4tfl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670352 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg724\" (UniqueName: \"kubernetes.io/projected/b047da04-c7bc-47c3-aae0-71b9d98a650e-kube-api-access-qg724\") pod \"heat-operator-controller-manager-69f49c598c-pbjl2\" (UID: \"b047da04-c7bc-47c3-aae0-71b9d98a650e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670407 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj4kl\" (UniqueName: \"kubernetes.io/projected/af0d0299-b4a6-47bd-99a1-567fd5126f5c-kube-api-access-kj4kl\") pod \"neutron-operator-controller-manager-64ddbf8bb-bs8mv\" (UID: \"af0d0299-b4a6-47bd-99a1-567fd5126f5c\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670442 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58qm\" (UniqueName: \"kubernetes.io/projected/86dd701f-396e-4fc1-8d42-29befb968db9-kube-api-access-w58qm\") pod \"manila-operator-controller-manager-54f6768c69-bg6jh\" (UID: \"86dd701f-396e-4fc1-8d42-29befb968db9\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670476 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zsf\" (UniqueName: \"kubernetes.io/projected/f1b21582-c058-4cbc-bc6b-95d77c4a526c-kube-api-access-t5zsf\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670503 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmlgq\" (UniqueName: \"kubernetes.io/projected/e2712b1a-9cd8-40f1-aedd-8c6146c4182e-kube-api-access-vmlgq\") pod \"octavia-operator-controller-manager-69f8888797-ft5sm\" (UID: \"e2712b1a-9cd8-40f1-aedd-8c6146c4182e\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670533 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kdlj\" (UniqueName: \"kubernetes.io/projected/a263ae62-717d-4032-b1e2-042f1b3e936e-kube-api-access-2kdlj\") pod \"ironic-operator-controller-manager-554564d7fc-qr226\" (UID: \"a263ae62-717d-4032-b1e2-042f1b3e936e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670577 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wvbnh\" (UniqueName: \"kubernetes.io/projected/32942f2f-2abb-4f90-8444-ed9e77aeef57-kube-api-access-wvbnh\") pod \"nova-operator-controller-manager-567668f5cf-h6tck\" (UID: \"32942f2f-2abb-4f90-8444-ed9e77aeef57\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670605 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjxd\" (UniqueName: \"kubernetes.io/projected/d331d283-a586-4c54-9732-151a8050ed40-kube-api-access-4cjxd\") pod \"mariadb-operator-controller-manager-6994f66f48-lmr7j\" (UID: \"d331d283-a586-4c54-9732-151a8050ed40\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670635 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz784\" (UniqueName: \"kubernetes.io/projected/4f170913-8ccb-42e2-9113-23fb049373c9-kube-api-access-xz784\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670662 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv554\" (UniqueName: \"kubernetes.io/projected/c67d85ac-ca9b-4b6f-826d-c04dd6b6850b-kube-api-access-tv554\") pod \"keystone-operator-controller-manager-b4d948c87-wdx4l\" (UID: \"c67d85ac-ca9b-4b6f-826d-c04dd6b6850b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670701 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod 
\"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670743 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwbcj\" (UniqueName: \"kubernetes.io/projected/9334599a-1686-4e49-b6e2-799c2038e0df-kube-api-access-jwbcj\") pod \"horizon-operator-controller-manager-5b9b8895d5-cqtpj\" (UID: \"9334599a-1686-4e49-b6e2-799c2038e0df\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.670834 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:08 crc kubenswrapper[5047]: E0223 07:04:08.671040 5047 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 07:04:08 crc kubenswrapper[5047]: E0223 07:04:08.671107 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert podName:4f170913-8ccb-42e2-9113-23fb049373c9 nodeName:}" failed. No retries permitted until 2026-02-23 07:04:09.171085021 +0000 UTC m=+1171.422412155 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert") pod "infra-operator-controller-manager-79d975b745-hfcm8" (UID: "4f170913-8ccb-42e2-9113-23fb049373c9") : secret "infra-operator-webhook-server-cert" not found Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.676477 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.702317 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.712633 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg724\" (UniqueName: \"kubernetes.io/projected/b047da04-c7bc-47c3-aae0-71b9d98a650e-kube-api-access-qg724\") pod \"heat-operator-controller-manager-69f49c598c-pbjl2\" (UID: \"b047da04-c7bc-47c3-aae0-71b9d98a650e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.713169 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbcj\" (UniqueName: \"kubernetes.io/projected/9334599a-1686-4e49-b6e2-799c2038e0df-kube-api-access-jwbcj\") pod \"horizon-operator-controller-manager-5b9b8895d5-cqtpj\" (UID: \"9334599a-1686-4e49-b6e2-799c2038e0df\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.722197 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.722978 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz784\" (UniqueName: 
\"kubernetes.io/projected/4f170913-8ccb-42e2-9113-23fb049373c9-kube-api-access-xz784\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.726654 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kdlj\" (UniqueName: \"kubernetes.io/projected/a263ae62-717d-4032-b1e2-042f1b3e936e-kube-api-access-2kdlj\") pod \"ironic-operator-controller-manager-554564d7fc-qr226\" (UID: \"a263ae62-717d-4032-b1e2-042f1b3e936e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.727593 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv554\" (UniqueName: \"kubernetes.io/projected/c67d85ac-ca9b-4b6f-826d-c04dd6b6850b-kube-api-access-tv554\") pod \"keystone-operator-controller-manager-b4d948c87-wdx4l\" (UID: \"c67d85ac-ca9b-4b6f-826d-c04dd6b6850b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.731088 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.735361 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rpfx5" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.748152 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.752588 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.759640 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rcksx" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772518 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772607 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgzw6\" (UniqueName: \"kubernetes.io/projected/9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71-kube-api-access-jgzw6\") pod \"ovn-operator-controller-manager-d44cf6b75-w297v\" (UID: \"9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772691 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj4kl\" (UniqueName: \"kubernetes.io/projected/af0d0299-b4a6-47bd-99a1-567fd5126f5c-kube-api-access-kj4kl\") pod \"neutron-operator-controller-manager-64ddbf8bb-bs8mv\" (UID: \"af0d0299-b4a6-47bd-99a1-567fd5126f5c\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772720 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58qm\" (UniqueName: \"kubernetes.io/projected/86dd701f-396e-4fc1-8d42-29befb968db9-kube-api-access-w58qm\") pod 
\"manila-operator-controller-manager-54f6768c69-bg6jh\" (UID: \"86dd701f-396e-4fc1-8d42-29befb968db9\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772752 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zsf\" (UniqueName: \"kubernetes.io/projected/f1b21582-c058-4cbc-bc6b-95d77c4a526c-kube-api-access-t5zsf\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772778 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmlgq\" (UniqueName: \"kubernetes.io/projected/e2712b1a-9cd8-40f1-aedd-8c6146c4182e-kube-api-access-vmlgq\") pod \"octavia-operator-controller-manager-69f8888797-ft5sm\" (UID: \"e2712b1a-9cd8-40f1-aedd-8c6146c4182e\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772829 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvbnh\" (UniqueName: \"kubernetes.io/projected/32942f2f-2abb-4f90-8444-ed9e77aeef57-kube-api-access-wvbnh\") pod \"nova-operator-controller-manager-567668f5cf-h6tck\" (UID: \"32942f2f-2abb-4f90-8444-ed9e77aeef57\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.772860 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjxd\" (UniqueName: \"kubernetes.io/projected/d331d283-a586-4c54-9732-151a8050ed40-kube-api-access-4cjxd\") pod \"mariadb-operator-controller-manager-6994f66f48-lmr7j\" (UID: \"d331d283-a586-4c54-9732-151a8050ed40\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" Feb 23 07:04:08 crc kubenswrapper[5047]: E0223 07:04:08.774383 5047 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.783735 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" Feb 23 07:04:08 crc kubenswrapper[5047]: E0223 07:04:08.784250 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert podName:f1b21582-c058-4cbc-bc6b-95d77c4a526c nodeName:}" failed. No retries permitted until 2026-02-23 07:04:09.284229703 +0000 UTC m=+1171.535556837 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" (UID: "f1b21582-c058-4cbc-bc6b-95d77c4a526c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.802042 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.811519 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.823013 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zsf\" (UniqueName: \"kubernetes.io/projected/f1b21582-c058-4cbc-bc6b-95d77c4a526c-kube-api-access-t5zsf\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.823850 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj4kl\" (UniqueName: \"kubernetes.io/projected/af0d0299-b4a6-47bd-99a1-567fd5126f5c-kube-api-access-kj4kl\") pod \"neutron-operator-controller-manager-64ddbf8bb-bs8mv\" (UID: \"af0d0299-b4a6-47bd-99a1-567fd5126f5c\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.823861 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvbnh\" (UniqueName: \"kubernetes.io/projected/32942f2f-2abb-4f90-8444-ed9e77aeef57-kube-api-access-wvbnh\") pod \"nova-operator-controller-manager-567668f5cf-h6tck\" (UID: \"32942f2f-2abb-4f90-8444-ed9e77aeef57\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.823871 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmlgq\" (UniqueName: \"kubernetes.io/projected/e2712b1a-9cd8-40f1-aedd-8c6146c4182e-kube-api-access-vmlgq\") pod \"octavia-operator-controller-manager-69f8888797-ft5sm\" (UID: \"e2712b1a-9cd8-40f1-aedd-8c6146c4182e\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.826602 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58qm\" (UniqueName: \"kubernetes.io/projected/86dd701f-396e-4fc1-8d42-29befb968db9-kube-api-access-w58qm\") pod \"manila-operator-controller-manager-54f6768c69-bg6jh\" (UID: \"86dd701f-396e-4fc1-8d42-29befb968db9\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.831343 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4cjxd\" (UniqueName: \"kubernetes.io/projected/d331d283-a586-4c54-9732-151a8050ed40-kube-api-access-4cjxd\") pod \"mariadb-operator-controller-manager-6994f66f48-lmr7j\" (UID: \"d331d283-a586-4c54-9732-151a8050ed40\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.864610 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.882057 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvl8\" (UniqueName: \"kubernetes.io/projected/f45166f5-93e4-4787-a25f-0c19e1e83cd5-kube-api-access-gxvl8\") pod \"swift-operator-controller-manager-68f46476f-d8rhk\" (UID: \"f45166f5-93e4-4787-a25f-0c19e1e83cd5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.882140 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56nqn\" (UniqueName: \"kubernetes.io/projected/f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04-kube-api-access-56nqn\") pod \"placement-operator-controller-manager-8497b45c89-t4qfd\" (UID: \"f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.882184 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgzw6\" (UniqueName: \"kubernetes.io/projected/9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71-kube-api-access-jgzw6\") pod \"ovn-operator-controller-manager-d44cf6b75-w297v\" (UID: \"9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" Feb 23 07:04:08 crc 
kubenswrapper[5047]: I0223 07:04:08.895850 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.898971 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.907850 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-s8vch" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.930216 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgzw6\" (UniqueName: \"kubernetes.io/projected/9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71-kube-api-access-jgzw6\") pod \"ovn-operator-controller-manager-d44cf6b75-w297v\" (UID: \"9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.934811 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb"] Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.962822 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.986860 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2wcf\" (UniqueName: \"kubernetes.io/projected/ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7-kube-api-access-h2wcf\") pod \"telemetry-operator-controller-manager-7f45b4ff68-fqngb\" (UID: \"ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.987007 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvl8\" (UniqueName: \"kubernetes.io/projected/f45166f5-93e4-4787-a25f-0c19e1e83cd5-kube-api-access-gxvl8\") pod \"swift-operator-controller-manager-68f46476f-d8rhk\" (UID: \"f45166f5-93e4-4787-a25f-0c19e1e83cd5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" Feb 23 07:04:08 crc kubenswrapper[5047]: I0223 07:04:08.987065 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56nqn\" (UniqueName: \"kubernetes.io/projected/f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04-kube-api-access-56nqn\") pod \"placement-operator-controller-manager-8497b45c89-t4qfd\" (UID: \"f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.015613 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.026273 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.030900 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvl8\" (UniqueName: \"kubernetes.io/projected/f45166f5-93e4-4787-a25f-0c19e1e83cd5-kube-api-access-gxvl8\") pod \"swift-operator-controller-manager-68f46476f-d8rhk\" (UID: \"f45166f5-93e4-4787-a25f-0c19e1e83cd5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.035728 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.039967 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-5v756"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.048968 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.052502 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.054821 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56nqn\" (UniqueName: \"kubernetes.io/projected/f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04-kube-api-access-56nqn\") pod \"placement-operator-controller-manager-8497b45c89-t4qfd\" (UID: \"f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.056463 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-h76cf" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.059201 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-5v756"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.062039 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.085774 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.088344 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2wcf\" (UniqueName: \"kubernetes.io/projected/ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7-kube-api-access-h2wcf\") pod \"telemetry-operator-controller-manager-7f45b4ff68-fqngb\" (UID: \"ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.088483 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dmc\" (UniqueName: \"kubernetes.io/projected/7340bc8e-daf4-474d-9b0e-3363755d8f43-kube-api-access-s9dmc\") pod \"test-operator-controller-manager-7866795846-5v756\" (UID: \"7340bc8e-daf4-474d-9b0e-3363755d8f43\") " pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.121809 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.127727 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2wcf\" (UniqueName: \"kubernetes.io/projected/ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7-kube-api-access-h2wcf\") pod \"telemetry-operator-controller-manager-7f45b4ff68-fqngb\" (UID: \"ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.128027 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.130274 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.139034 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-zk4bv" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.163657 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.163968 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.184503 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.190948 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.210661 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.210752 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dmc\" (UniqueName: \"kubernetes.io/projected/7340bc8e-daf4-474d-9b0e-3363755d8f43-kube-api-access-s9dmc\") pod \"test-operator-controller-manager-7866795846-5v756\" (UID: \"7340bc8e-daf4-474d-9b0e-3363755d8f43\") " pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.210788 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcnnl\" (UniqueName: \"kubernetes.io/projected/139b2268-6b8e-42e2-934a-639863e06507-kube-api-access-qcnnl\") pod \"watcher-operator-controller-manager-5db88f68c-6f5tp\" (UID: \"139b2268-6b8e-42e2-934a-639863e06507\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.211210 5047 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.211273 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert podName:4f170913-8ccb-42e2-9113-23fb049373c9 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:04:10.211252137 +0000 UTC m=+1172.462579271 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert") pod "infra-operator-controller-manager-79d975b745-hfcm8" (UID: "4f170913-8ccb-42e2-9113-23fb049373c9") : secret "infra-operator-webhook-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: W0223 07:04:09.221543 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a20255_deb8_4cda_a48c_2735b1e66247.slice/crio-33847cbdd4dc6ecc7d814606a6c369d8b55300117e2b0677324d5489a91cb198 WatchSource:0}: Error finding container 33847cbdd4dc6ecc7d814606a6c369d8b55300117e2b0677324d5489a91cb198: Status 404 returned error can't find the container with id 33847cbdd4dc6ecc7d814606a6c369d8b55300117e2b0677324d5489a91cb198 Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.237169 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.269732 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dmc\" (UniqueName: \"kubernetes.io/projected/7340bc8e-daf4-474d-9b0e-3363755d8f43-kube-api-access-s9dmc\") pod \"test-operator-controller-manager-7866795846-5v756\" (UID: \"7340bc8e-daf4-474d-9b0e-3363755d8f43\") " pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.289999 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.291202 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.312869 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcnnl\" (UniqueName: \"kubernetes.io/projected/139b2268-6b8e-42e2-934a-639863e06507-kube-api-access-qcnnl\") pod \"watcher-operator-controller-manager-5db88f68c-6f5tp\" (UID: \"139b2268-6b8e-42e2-934a-639863e06507\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.312963 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.313120 5047 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.313174 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert podName:f1b21582-c058-4cbc-bc6b-95d77c4a526c nodeName:}" failed. No retries permitted until 2026-02-23 07:04:10.313156681 +0000 UTC m=+1172.564483815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" (UID: "f1b21582-c058-4cbc-bc6b-95d77c4a526c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.324981 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.329897 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.330508 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.333926 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5thq4" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.349257 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.350786 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcnnl\" (UniqueName: \"kubernetes.io/projected/139b2268-6b8e-42e2-934a-639863e06507-kube-api-access-qcnnl\") pod \"watcher-operator-controller-manager-5db88f68c-6f5tp\" (UID: \"139b2268-6b8e-42e2-934a-639863e06507\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.364256 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-7642d"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.411461 5047 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.413026 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.414848 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.414895 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.415183 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czdhz\" (UniqueName: \"kubernetes.io/projected/a6a5f402-78f7-45b9-8358-c16da1787c4e-kube-api-access-czdhz\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.418412 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xw7bq" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.424250 5047 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.454667 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.491147 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.503667 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.517563 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czdhz\" (UniqueName: \"kubernetes.io/projected/a6a5f402-78f7-45b9-8358-c16da1787c4e-kube-api-access-czdhz\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.517664 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.517723 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: 
\"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.517769 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdkrf\" (UniqueName: \"kubernetes.io/projected/d4f545e3-e6a7-41f1-84cb-895175df22cf-kube-api-access-rdkrf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p9fc8\" (UID: \"d4f545e3-e6a7-41f1-84cb-895175df22cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.518110 5047 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.518141 5047 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.518250 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:10.018212619 +0000 UTC m=+1172.269539743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "metrics-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: E0223 07:04:09.518283 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. 
No retries permitted until 2026-02-23 07:04:10.01825946 +0000 UTC m=+1172.269586594 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "webhook-server-cert" not found Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.536257 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.542471 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czdhz\" (UniqueName: \"kubernetes.io/projected/a6a5f402-78f7-45b9-8358-c16da1787c4e-kube-api-access-czdhz\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:09 crc kubenswrapper[5047]: W0223 07:04:09.558851 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd87e444_98c8_4de9_9a11_ec9678daaeaa.slice/crio-9dcae045f1d7ba48a9d64c9ae9dad01d8336b633e46a0e8cfb5f497ea5f9cf15 WatchSource:0}: Error finding container 9dcae045f1d7ba48a9d64c9ae9dad01d8336b633e46a0e8cfb5f497ea5f9cf15: Status 404 returned error can't find the container with id 9dcae045f1d7ba48a9d64c9ae9dad01d8336b633e46a0e8cfb5f497ea5f9cf15 Feb 23 07:04:09 crc kubenswrapper[5047]: W0223 07:04:09.602378 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b2c3a5_05f7_4f75_a03b_b79278119309.slice/crio-c8526f5177c6e3f773fa82eaa2455b187bc15f6265ee9bbab0ff9e5c6ec03403 WatchSource:0}: Error finding container 
c8526f5177c6e3f773fa82eaa2455b187bc15f6265ee9bbab0ff9e5c6ec03403: Status 404 returned error can't find the container with id c8526f5177c6e3f773fa82eaa2455b187bc15f6265ee9bbab0ff9e5c6ec03403 Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.618961 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdkrf\" (UniqueName: \"kubernetes.io/projected/d4f545e3-e6a7-41f1-84cb-895175df22cf-kube-api-access-rdkrf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p9fc8\" (UID: \"d4f545e3-e6a7-41f1-84cb-895175df22cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.655562 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdkrf\" (UniqueName: \"kubernetes.io/projected/d4f545e3-e6a7-41f1-84cb-895175df22cf-kube-api-access-rdkrf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p9fc8\" (UID: \"d4f545e3-e6a7-41f1-84cb-895175df22cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.667890 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.708023 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj"] Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.748398 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" Feb 23 07:04:09 crc kubenswrapper[5047]: I0223 07:04:09.929851 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226"] Feb 23 07:04:10 crc kubenswrapper[5047]: W0223 07:04:10.016050 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda263ae62_717d_4032_b1e2_042f1b3e936e.slice/crio-00e0a956062115931a4e8e8eb6b57c6b7b8d2fb722f0606d5ab45f3aaafca159 WatchSource:0}: Error finding container 00e0a956062115931a4e8e8eb6b57c6b7b8d2fb722f0606d5ab45f3aaafca159: Status 404 returned error can't find the container with id 00e0a956062115931a4e8e8eb6b57c6b7b8d2fb722f0606d5ab45f3aaafca159 Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.029405 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.029475 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.029668 5047 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.029734 5047 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:11.029710724 +0000 UTC m=+1173.281037858 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "webhook-server-cert" not found
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.037821 5047 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.038265 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:11.037890191 +0000 UTC m=+1173.289217325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "metrics-server-cert" not found
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.130102 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" event={"ID":"a263ae62-717d-4032-b1e2-042f1b3e936e","Type":"ContainerStarted","Data":"00e0a956062115931a4e8e8eb6b57c6b7b8d2fb722f0606d5ab45f3aaafca159"}
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.154416 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" event={"ID":"8dcb699b-69b9-4a9c-86ff-d04bf088e297","Type":"ContainerStarted","Data":"f9b47c17434a201629242031b3ff0e128611b1c0b4de2d1eef7f92d2edcb6d1a"}
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.158225 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" event={"ID":"50b2c3a5-05f7-4f75-a03b-b79278119309","Type":"ContainerStarted","Data":"c8526f5177c6e3f773fa82eaa2455b187bc15f6265ee9bbab0ff9e5c6ec03403"}
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.166863 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" event={"ID":"fd87e444-98c8-4de9-9a11-ec9678daaeaa","Type":"ContainerStarted","Data":"9dcae045f1d7ba48a9d64c9ae9dad01d8336b633e46a0e8cfb5f497ea5f9cf15"}
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.169131 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" event={"ID":"9334599a-1686-4e49-b6e2-799c2038e0df","Type":"ContainerStarted","Data":"f2abb339b3901adbe256d17e1df33ca0473cc631cc94753a22822ae3f9e9efae"}
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.175585 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" event={"ID":"d5a20255-deb8-4cda-a48c-2735b1e66247","Type":"ContainerStarted","Data":"33847cbdd4dc6ecc7d814606a6c369d8b55300117e2b0677324d5489a91cb198"}
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.232455 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8"
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.232696 5047 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.232763 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert podName:4f170913-8ccb-42e2-9113-23fb049373c9 nodeName:}" failed. No retries permitted until 2026-02-23 07:04:12.232741849 +0000 UTC m=+1174.484068983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert") pod "infra-operator-controller-manager-79d975b745-hfcm8" (UID: "4f170913-8ccb-42e2-9113-23fb049373c9") : secret "infra-operator-webhook-server-cert" not found
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.336588 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl"
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.336939 5047 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.337034 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert podName:f1b21582-c058-4cbc-bc6b-95d77c4a526c nodeName:}" failed. No retries permitted until 2026-02-23 07:04:12.337013974 +0000 UTC m=+1174.588341108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" (UID: "f1b21582-c058-4cbc-bc6b-95d77c4a526c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.369492 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.369532 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.374743 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2"]
Feb 23 07:04:10 crc kubenswrapper[5047]: W0223 07:04:10.398188 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb047da04_c7bc_47c3_aae0_71b9d98a650e.slice/crio-a4b7d2255ecdfafc1e86250d40b626d6df3ba7bd3b94ce34dbc7fc8b49100a7c WatchSource:0}: Error finding container a4b7d2255ecdfafc1e86250d40b626d6df3ba7bd3b94ce34dbc7fc8b49100a7c: Status 404 returned error can't find the container with id a4b7d2255ecdfafc1e86250d40b626d6df3ba7bd3b94ce34dbc7fc8b49100a7c
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.482265 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.517700 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.540323 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.540381 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.551987 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.557221 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.675548 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-5v756"]
Feb 23 07:04:10 crc kubenswrapper[5047]: W0223 07:04:10.688462 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7340bc8e_daf4_474d_9b0e_3363755d8f43.slice/crio-63eefd4c763e1c03363ea6a5479ea9c98ce1ef3d9220dadbfcfa1c905ce1670c WatchSource:0}: Error finding container 63eefd4c763e1c03363ea6a5479ea9c98ce1ef3d9220dadbfcfa1c905ce1670c: Status 404 returned error can't find the container with id 63eefd4c763e1c03363ea6a5479ea9c98ce1ef3d9220dadbfcfa1c905ce1670c
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.693286 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.699257 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb"]
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.703687 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp"]
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.720684 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vmlgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-ft5sm_openstack-operators(e2712b1a-9cd8-40f1-aedd-8c6146c4182e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.721927 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" podUID="e2712b1a-9cd8-40f1-aedd-8c6146c4182e"
Feb 23 07:04:10 crc kubenswrapper[5047]: W0223 07:04:10.729129 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada55ddd_c9d2_47ac_b8e2_efcfe2b45bd7.slice/crio-9078ee56a0890ba9f01a78d6db579e7fff35be885311caa454cd5cc98e001cb5 WatchSource:0}: Error finding container 9078ee56a0890ba9f01a78d6db579e7fff35be885311caa454cd5cc98e001cb5: Status 404 returned error can't find the container with id 9078ee56a0890ba9f01a78d6db579e7fff35be885311caa454cd5cc98e001cb5
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.735042 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h2wcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-fqngb_openstack-operators(ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.736912 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" podUID="ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7"
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.752209 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qcnnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-6f5tp_openstack-operators(139b2268-6b8e-42e2-934a-639863e06507): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:04:10 crc kubenswrapper[5047]: E0223 07:04:10.753626 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" podUID="139b2268-6b8e-42e2-934a-639863e06507"
Feb 23 07:04:10 crc kubenswrapper[5047]: I0223 07:04:10.810988 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8"]
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.059369 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.060019 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"
Feb 23 07:04:11 crc kubenswrapper[5047]: E0223 07:04:11.059594 5047 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 23 07:04:11 crc kubenswrapper[5047]: E0223 07:04:11.060388 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:13.060350127 +0000 UTC m=+1175.311677451 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "metrics-server-cert" not found
Feb 23 07:04:11 crc kubenswrapper[5047]: E0223 07:04:11.060273 5047 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 23 07:04:11 crc kubenswrapper[5047]: E0223 07:04:11.060856 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:13.060832 +0000 UTC m=+1175.312159134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "webhook-server-cert" not found
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.197014 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" event={"ID":"d4f545e3-e6a7-41f1-84cb-895175df22cf","Type":"ContainerStarted","Data":"de91d6e3ebff131e0bd1d7c6099e1c1cccc79c1703c7effc64f3fef4fb60fa64"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.198628 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" event={"ID":"139b2268-6b8e-42e2-934a-639863e06507","Type":"ContainerStarted","Data":"dcb80c5ad54091c85e5733932e128d6f9b9a9e4d67821c5b6bdc3915ed5b32fd"}
Feb 23 07:04:11 crc kubenswrapper[5047]: E0223 07:04:11.213675 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" podUID="139b2268-6b8e-42e2-934a-639863e06507"
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.218373 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" event={"ID":"e2712b1a-9cd8-40f1-aedd-8c6146c4182e","Type":"ContainerStarted","Data":"7afd8ee25ab0a10796244b09da714f95b55fce014a57c7ec40f4b7a388e021ad"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.223262 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" event={"ID":"f45166f5-93e4-4787-a25f-0c19e1e83cd5","Type":"ContainerStarted","Data":"9dc60de2fd9879d8bfbb946f3d394713ab8b8ec418f4e793adce4f13d310bb3b"}
Feb 23 07:04:11 crc kubenswrapper[5047]: E0223 07:04:11.228478 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" podUID="e2712b1a-9cd8-40f1-aedd-8c6146c4182e"
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.229944 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" event={"ID":"d331d283-a586-4c54-9732-151a8050ed40","Type":"ContainerStarted","Data":"2ad01747efe03a8d0d9e8ffdf44a5bd498031d03d26ea59dffe812c5819d969d"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.257203 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" event={"ID":"af0d0299-b4a6-47bd-99a1-567fd5126f5c","Type":"ContainerStarted","Data":"2415ca24e8707aed53de4c0a3470644aaa353fce318f5c5c68f81c9ee9b45863"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.263600 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" event={"ID":"86dd701f-396e-4fc1-8d42-29befb968db9","Type":"ContainerStarted","Data":"ccc0b8bedffed8bbddd9346c2d60b4cce6ac0421e27a33e4002438446f229b2d"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.267062 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" event={"ID":"c67d85ac-ca9b-4b6f-826d-c04dd6b6850b","Type":"ContainerStarted","Data":"bb76c4d72bf351ba7063afd6f7efe3a2a147e6b6673e6e33480f23c28a72c753"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.270409 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" event={"ID":"7340bc8e-daf4-474d-9b0e-3363755d8f43","Type":"ContainerStarted","Data":"63eefd4c763e1c03363ea6a5479ea9c98ce1ef3d9220dadbfcfa1c905ce1670c"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.272265 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" event={"ID":"32942f2f-2abb-4f90-8444-ed9e77aeef57","Type":"ContainerStarted","Data":"0b5301558b20921fa66d8193fb7d1ff1428bfabe74bde50ff7fcb3f06192bcb3"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.284865 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" event={"ID":"b047da04-c7bc-47c3-aae0-71b9d98a650e","Type":"ContainerStarted","Data":"a4b7d2255ecdfafc1e86250d40b626d6df3ba7bd3b94ce34dbc7fc8b49100a7c"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.288143 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" event={"ID":"f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04","Type":"ContainerStarted","Data":"79510fe73aee003d3f8b44ac485c7581455721cc3507713e03b0a98324df4836"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.289454 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" event={"ID":"9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71","Type":"ContainerStarted","Data":"3efb72d236fdb51f47312e8eb40788aee1fef49122bf5a64b92ac26309e52c27"}
Feb 23 07:04:11 crc kubenswrapper[5047]: I0223 07:04:11.297778 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" event={"ID":"ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7","Type":"ContainerStarted","Data":"9078ee56a0890ba9f01a78d6db579e7fff35be885311caa454cd5cc98e001cb5"}
Feb 23 07:04:11 crc kubenswrapper[5047]: E0223 07:04:11.302132 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" podUID="ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7"
Feb 23 07:04:12 crc kubenswrapper[5047]: I0223 07:04:12.284648 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8"
Feb 23 07:04:12 crc kubenswrapper[5047]: E0223 07:04:12.284963 5047 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 23 07:04:12 crc kubenswrapper[5047]: E0223 07:04:12.285040 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert podName:4f170913-8ccb-42e2-9113-23fb049373c9 nodeName:}" failed. No retries permitted until 2026-02-23 07:04:16.285017656 +0000 UTC m=+1178.536344780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert") pod "infra-operator-controller-manager-79d975b745-hfcm8" (UID: "4f170913-8ccb-42e2-9113-23fb049373c9") : secret "infra-operator-webhook-server-cert" not found
Feb 23 07:04:12 crc kubenswrapper[5047]: E0223 07:04:12.381341 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" podUID="139b2268-6b8e-42e2-934a-639863e06507"
Feb 23 07:04:12 crc kubenswrapper[5047]: E0223 07:04:12.381467 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" podUID="ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7"
Feb 23 07:04:12 crc kubenswrapper[5047]: I0223 07:04:12.386440 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl"
Feb 23 07:04:12 crc kubenswrapper[5047]: E0223 07:04:12.387871 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" podUID="e2712b1a-9cd8-40f1-aedd-8c6146c4182e"
Feb 23 07:04:12 crc kubenswrapper[5047]: E0223 07:04:12.388640 5047 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:04:12 crc kubenswrapper[5047]: E0223 07:04:12.388703 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert podName:f1b21582-c058-4cbc-bc6b-95d77c4a526c nodeName:}" failed. No retries permitted until 2026-02-23 07:04:16.388683596 +0000 UTC m=+1178.640010730 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" (UID: "f1b21582-c058-4cbc-bc6b-95d77c4a526c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:04:13 crc kubenswrapper[5047]: E0223 07:04:13.109247 5047 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 23 07:04:13 crc kubenswrapper[5047]: E0223 07:04:13.109763 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:17.109742359 +0000 UTC m=+1179.361069493 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "metrics-server-cert" not found
Feb 23 07:04:13 crc kubenswrapper[5047]: I0223 07:04:13.114030 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"
Feb 23 07:04:13 crc kubenswrapper[5047]: I0223 07:04:13.114158 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"
Feb 23 07:04:13 crc kubenswrapper[5047]: E0223 07:04:13.114281 5047 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 23 07:04:13 crc kubenswrapper[5047]: E0223 07:04:13.114351 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:17.114338561 +0000 UTC m=+1179.365665895 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "webhook-server-cert" not found
Feb 23 07:04:16 crc kubenswrapper[5047]: E0223 07:04:16.378922 5047 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 23 07:04:16 crc kubenswrapper[5047]: E0223 07:04:16.379325 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert podName:4f170913-8ccb-42e2-9113-23fb049373c9 nodeName:}" failed. No retries permitted until 2026-02-23 07:04:24.379301171 +0000 UTC m=+1186.630628305 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert") pod "infra-operator-controller-manager-79d975b745-hfcm8" (UID: "4f170913-8ccb-42e2-9113-23fb049373c9") : secret "infra-operator-webhook-server-cert" not found
Feb 23 07:04:16 crc kubenswrapper[5047]: I0223 07:04:16.379027 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8"
Feb 23 07:04:16 crc kubenswrapper[5047]: I0223 07:04:16.481461 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl"
Feb 23 07:04:16 crc kubenswrapper[5047]: E0223 07:04:16.481750 5047 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:04:16 crc kubenswrapper[5047]: E0223 07:04:16.481941 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert podName:f1b21582-c058-4cbc-bc6b-95d77c4a526c nodeName:}" failed. No retries permitted until 2026-02-23 07:04:24.481870421 +0000 UTC m=+1186.733197595 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" (UID: "f1b21582-c058-4cbc-bc6b-95d77c4a526c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:04:17 crc kubenswrapper[5047]: I0223 07:04:17.194827 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"
Feb 23 07:04:17 crc kubenswrapper[5047]: I0223 07:04:17.195055 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"
Feb 23 07:04:17 crc kubenswrapper[5047]: E0223 07:04:17.195134 5047 secret.go:188] Couldn't get
secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:04:17 crc kubenswrapper[5047]: E0223 07:04:17.195246 5047 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 07:04:17 crc kubenswrapper[5047]: E0223 07:04:17.195252 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:25.19522396 +0000 UTC m=+1187.446551104 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "webhook-server-cert" not found Feb 23 07:04:17 crc kubenswrapper[5047]: E0223 07:04:17.195364 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:25.195325002 +0000 UTC m=+1187.446652376 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "metrics-server-cert" not found Feb 23 07:04:23 crc kubenswrapper[5047]: E0223 07:04:23.293329 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 23 07:04:23 crc kubenswrapper[5047]: E0223 07:04:23.294626 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4cjxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-lmr7j_openstack-operators(d331d283-a586-4c54-9732-151a8050ed40): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:23 crc kubenswrapper[5047]: E0223 07:04:23.296163 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" podUID="d331d283-a586-4c54-9732-151a8050ed40" Feb 23 07:04:23 crc kubenswrapper[5047]: E0223 07:04:23.532044 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" podUID="d331d283-a586-4c54-9732-151a8050ed40" Feb 23 07:04:24 crc kubenswrapper[5047]: E0223 07:04:24.106557 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04" Feb 23 07:04:24 crc kubenswrapper[5047]: E0223 07:04:24.106790 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gxvl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-d8rhk_openstack-operators(f45166f5-93e4-4787-a25f-0c19e1e83cd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:24 crc kubenswrapper[5047]: E0223 07:04:24.108130 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" podUID="f45166f5-93e4-4787-a25f-0c19e1e83cd5" Feb 23 07:04:24 crc kubenswrapper[5047]: I0223 07:04:24.435246 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod 
\"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:24 crc kubenswrapper[5047]: I0223 07:04:24.457837 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f170913-8ccb-42e2-9113-23fb049373c9-cert\") pod \"infra-operator-controller-manager-79d975b745-hfcm8\" (UID: \"4f170913-8ccb-42e2-9113-23fb049373c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:24 crc kubenswrapper[5047]: I0223 07:04:24.536851 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:24 crc kubenswrapper[5047]: E0223 07:04:24.539764 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" podUID="f45166f5-93e4-4787-a25f-0c19e1e83cd5" Feb 23 07:04:24 crc kubenswrapper[5047]: I0223 07:04:24.545571 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1b21582-c058-4cbc-bc6b-95d77c4a526c-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl\" (UID: \"f1b21582-c058-4cbc-bc6b-95d77c4a526c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:24 crc kubenswrapper[5047]: I0223 
07:04:24.716320 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:24 crc kubenswrapper[5047]: I0223 07:04:24.752671 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:24 crc kubenswrapper[5047]: E0223 07:04:24.799376 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 23 07:04:24 crc kubenswrapper[5047]: E0223 07:04:24.799619 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2kdlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-qr226_openstack-operators(a263ae62-717d-4032-b1e2-042f1b3e936e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:24 crc kubenswrapper[5047]: E0223 07:04:24.800826 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" podUID="a263ae62-717d-4032-b1e2-042f1b3e936e" Feb 23 07:04:25 crc kubenswrapper[5047]: I0223 07:04:25.250317 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod 
\"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:25 crc kubenswrapper[5047]: I0223 07:04:25.250917 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:25 crc kubenswrapper[5047]: E0223 07:04:25.251098 5047 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:04:25 crc kubenswrapper[5047]: E0223 07:04:25.251199 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs podName:a6a5f402-78f7-45b9-8358-c16da1787c4e nodeName:}" failed. No retries permitted until 2026-02-23 07:04:41.251172241 +0000 UTC m=+1203.502499375 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-k9ww2" (UID: "a6a5f402-78f7-45b9-8358-c16da1787c4e") : secret "webhook-server-cert" not found Feb 23 07:04:25 crc kubenswrapper[5047]: I0223 07:04:25.257232 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:25 crc kubenswrapper[5047]: E0223 07:04:25.423187 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 23 07:04:25 crc kubenswrapper[5047]: E0223 07:04:25.423429 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w58qm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-bg6jh_openstack-operators(86dd701f-396e-4fc1-8d42-29befb968db9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:25 crc kubenswrapper[5047]: E0223 07:04:25.424753 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" podUID="86dd701f-396e-4fc1-8d42-29befb968db9" Feb 23 07:04:25 crc kubenswrapper[5047]: E0223 07:04:25.554638 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" podUID="a263ae62-717d-4032-b1e2-042f1b3e936e" Feb 23 07:04:25 crc kubenswrapper[5047]: E0223 07:04:25.554870 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" podUID="86dd701f-396e-4fc1-8d42-29befb968db9" Feb 23 07:04:26 crc kubenswrapper[5047]: E0223 07:04:26.067069 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 23 07:04:26 crc kubenswrapper[5047]: E0223 07:04:26.067487 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kj4kl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-bs8mv_openstack-operators(af0d0299-b4a6-47bd-99a1-567fd5126f5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:26 crc kubenswrapper[5047]: E0223 07:04:26.068938 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" podUID="af0d0299-b4a6-47bd-99a1-567fd5126f5c" Feb 23 07:04:26 crc kubenswrapper[5047]: E0223 07:04:26.562272 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" podUID="af0d0299-b4a6-47bd-99a1-567fd5126f5c" Feb 23 07:04:28 crc kubenswrapper[5047]: E0223 07:04:28.197961 5047 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 23 07:04:28 crc kubenswrapper[5047]: E0223 07:04:28.198167 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-56nqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-t4qfd_openstack-operators(f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:28 crc kubenswrapper[5047]: E0223 07:04:28.199337 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" podUID="f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04" Feb 23 07:04:28 crc kubenswrapper[5047]: E0223 07:04:28.577558 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" podUID="f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04" Feb 23 07:04:28 crc kubenswrapper[5047]: E0223 07:04:28.846211 5047 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 23 07:04:28 crc kubenswrapper[5047]: E0223 07:04:28.846562 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tv554,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-wdx4l_openstack-operators(c67d85ac-ca9b-4b6f-826d-c04dd6b6850b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:28 crc kubenswrapper[5047]: E0223 07:04:28.848791 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" podUID="c67d85ac-ca9b-4b6f-826d-c04dd6b6850b" Feb 23 07:04:29 crc kubenswrapper[5047]: E0223 07:04:29.442010 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 23 07:04:29 crc kubenswrapper[5047]: E0223 07:04:29.443148 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wvbnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-h6tck_openstack-operators(32942f2f-2abb-4f90-8444-ed9e77aeef57): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:29 crc kubenswrapper[5047]: E0223 07:04:29.444718 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" podUID="32942f2f-2abb-4f90-8444-ed9e77aeef57" Feb 23 07:04:29 crc kubenswrapper[5047]: E0223 07:04:29.587987 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" podUID="c67d85ac-ca9b-4b6f-826d-c04dd6b6850b" Feb 23 07:04:29 crc kubenswrapper[5047]: E0223 07:04:29.588370 5047 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" podUID="32942f2f-2abb-4f90-8444-ed9e77aeef57" Feb 23 07:04:31 crc kubenswrapper[5047]: E0223 07:04:31.301269 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 23 07:04:31 crc kubenswrapper[5047]: E0223 07:04:31.301625 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rdkrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-p9fc8_openstack-operators(d4f545e3-e6a7-41f1-84cb-895175df22cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:04:31 crc kubenswrapper[5047]: E0223 07:04:31.302931 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" podUID="d4f545e3-e6a7-41f1-84cb-895175df22cf" Feb 23 07:04:31 crc kubenswrapper[5047]: E0223 07:04:31.618755 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" podUID="d4f545e3-e6a7-41f1-84cb-895175df22cf" Feb 23 07:04:32 crc 
kubenswrapper[5047]: I0223 07:04:32.361092 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl"] Feb 23 07:04:32 crc kubenswrapper[5047]: W0223 07:04:32.368757 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1b21582_c058_4cbc_bc6b_95d77c4a526c.slice/crio-77ce58576d88ff4e7c6b44775afe00437e94f52101d0c564aab6d49ad0512e8d WatchSource:0}: Error finding container 77ce58576d88ff4e7c6b44775afe00437e94f52101d0c564aab6d49ad0512e8d: Status 404 returned error can't find the container with id 77ce58576d88ff4e7c6b44775afe00437e94f52101d0c564aab6d49ad0512e8d Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.407752 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8"] Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.637755 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" event={"ID":"e2712b1a-9cd8-40f1-aedd-8c6146c4182e","Type":"ContainerStarted","Data":"c300fe9084c5750894e5e90b7ade79bcae5cbc0d51bb1fd676766e921d7abd7c"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.638434 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.643238 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" event={"ID":"8dcb699b-69b9-4a9c-86ff-d04bf088e297","Type":"ContainerStarted","Data":"231eb7f78c490263983235d5b2c8fd4d9e1fa836da22ff3254d2f8856022de4e"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.643800 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.645536 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" event={"ID":"9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71","Type":"ContainerStarted","Data":"348ef0813d2942607bad7f30aa52d2f9d6ada51a7e35536128226580f5cf3c9e"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.646028 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.650495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" event={"ID":"d5a20255-deb8-4cda-a48c-2735b1e66247","Type":"ContainerStarted","Data":"cb866db3c70931486b9bab56fe8b6b93bc39d7afe54589d6fe65aaaf3efcc3e9"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.650633 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.663419 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" event={"ID":"9334599a-1686-4e49-b6e2-799c2038e0df","Type":"ContainerStarted","Data":"198785ee877fbf97ed8d84a15205db5831a35976a57585e3e81cea24e283a833"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.663587 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.665641 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" 
event={"ID":"b047da04-c7bc-47c3-aae0-71b9d98a650e","Type":"ContainerStarted","Data":"c50ab91b87a43c28737fa353f65445838ee18ffcca4b53bef6711abf845fd50c"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.666015 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.680575 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" podStartSLOduration=3.340376143 podStartE2EDuration="24.680552735s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.720424852 +0000 UTC m=+1172.971751986" lastFinishedPulling="2026-02-23 07:04:32.060601444 +0000 UTC m=+1194.311928578" observedRunningTime="2026-02-23 07:04:32.679485087 +0000 UTC m=+1194.930812221" watchObservedRunningTime="2026-02-23 07:04:32.680552735 +0000 UTC m=+1194.931879869" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.682129 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" event={"ID":"ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7","Type":"ContainerStarted","Data":"f4bdd157528924345b1f448ae7a7528a8bd64274db9674eaf073af2ceae8a919"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.682565 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.693093 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" event={"ID":"50b2c3a5-05f7-4f75-a03b-b79278119309","Type":"ContainerStarted","Data":"d3fea9ff8db8f0d10e24daa1ca96e0811a1b09e6c72517af2e2f055103d77e41"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.693501 
5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.700692 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" event={"ID":"f1b21582-c058-4cbc-bc6b-95d77c4a526c","Type":"ContainerStarted","Data":"77ce58576d88ff4e7c6b44775afe00437e94f52101d0c564aab6d49ad0512e8d"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.708371 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" podStartSLOduration=5.228813478 podStartE2EDuration="24.708351313s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.579754082 +0000 UTC m=+1172.831081216" lastFinishedPulling="2026-02-23 07:04:30.059291887 +0000 UTC m=+1192.310619051" observedRunningTime="2026-02-23 07:04:32.701538041 +0000 UTC m=+1194.952865175" watchObservedRunningTime="2026-02-23 07:04:32.708351313 +0000 UTC m=+1194.959678447" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.722406 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" event={"ID":"fd87e444-98c8-4de9-9a11-ec9678daaeaa","Type":"ContainerStarted","Data":"0ac03d14fa7e129ab47d904e0c5108f78e3711f7d0495280464aec1233fa5b61"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.723266 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.733891 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" 
event={"ID":"7340bc8e-daf4-474d-9b0e-3363755d8f43","Type":"ContainerStarted","Data":"26eb8343fad841c4e4d91a917359b88ace630dbe961b073a554c34f8ac3d1842"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.734641 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.739162 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" event={"ID":"4f170913-8ccb-42e2-9113-23fb049373c9","Type":"ContainerStarted","Data":"84bcc274f0e5c969ccd0c8f2cbd42bc4c32ea707ed4398550e6d61c735e94601"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.742644 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" podStartSLOduration=4.48045845 podStartE2EDuration="24.742617391s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:09.797005492 +0000 UTC m=+1172.048332626" lastFinishedPulling="2026-02-23 07:04:30.059164433 +0000 UTC m=+1192.310491567" observedRunningTime="2026-02-23 07:04:32.734499896 +0000 UTC m=+1194.985827020" watchObservedRunningTime="2026-02-23 07:04:32.742617391 +0000 UTC m=+1194.993944525" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.748462 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" event={"ID":"139b2268-6b8e-42e2-934a-639863e06507","Type":"ContainerStarted","Data":"f23aa791e1abc43903dca52b89efdce00a715dbd83d0cc6f603757307dddffc3"} Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.748812 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.777650 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" podStartSLOduration=4.700041354 podStartE2EDuration="24.77762878s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:09.34899523 +0000 UTC m=+1171.600322364" lastFinishedPulling="2026-02-23 07:04:29.426582656 +0000 UTC m=+1191.677909790" observedRunningTime="2026-02-23 07:04:32.770654865 +0000 UTC m=+1195.021981999" watchObservedRunningTime="2026-02-23 07:04:32.77762878 +0000 UTC m=+1195.028955914" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.806635 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" podStartSLOduration=4.5953213569999996 podStartE2EDuration="24.806611519s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:09.847876081 +0000 UTC m=+1172.099203215" lastFinishedPulling="2026-02-23 07:04:30.059166233 +0000 UTC m=+1192.310493377" observedRunningTime="2026-02-23 07:04:32.803431594 +0000 UTC m=+1195.054758738" watchObservedRunningTime="2026-02-23 07:04:32.806611519 +0000 UTC m=+1195.057938663" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.844172 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" podStartSLOduration=4.471890383 podStartE2EDuration="24.844150764s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.401116334 +0000 UTC m=+1172.652443468" lastFinishedPulling="2026-02-23 07:04:30.773376715 +0000 UTC m=+1193.024703849" observedRunningTime="2026-02-23 07:04:32.841583366 +0000 UTC m=+1195.092910500" watchObservedRunningTime="2026-02-23 07:04:32.844150764 +0000 UTC m=+1195.095477898" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.890070 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" podStartSLOduration=3.646418841 podStartE2EDuration="24.890046351s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.734847465 +0000 UTC m=+1172.986174599" lastFinishedPulling="2026-02-23 07:04:31.978474975 +0000 UTC m=+1194.229802109" observedRunningTime="2026-02-23 07:04:32.882566063 +0000 UTC m=+1195.133893197" watchObservedRunningTime="2026-02-23 07:04:32.890046351 +0000 UTC m=+1195.141373485" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.914714 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" podStartSLOduration=5.049364047 podStartE2EDuration="24.914690244s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:09.561028934 +0000 UTC m=+1171.812356068" lastFinishedPulling="2026-02-23 07:04:29.426355091 +0000 UTC m=+1191.677682265" observedRunningTime="2026-02-23 07:04:32.91113503 +0000 UTC m=+1195.162462154" watchObservedRunningTime="2026-02-23 07:04:32.914690244 +0000 UTC m=+1195.166017378" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.931161 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" podStartSLOduration=3.760798024 podStartE2EDuration="24.931137931s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.751983409 +0000 UTC m=+1173.003310543" lastFinishedPulling="2026-02-23 07:04:31.922323316 +0000 UTC m=+1194.173650450" observedRunningTime="2026-02-23 07:04:32.926896839 +0000 UTC m=+1195.178223983" watchObservedRunningTime="2026-02-23 07:04:32.931137931 +0000 UTC m=+1195.182465065" Feb 23 07:04:32 crc kubenswrapper[5047]: I0223 07:04:32.955526 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" podStartSLOduration=5.591051754 podStartE2EDuration="24.955502307s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.695388368 +0000 UTC m=+1172.946715502" lastFinishedPulling="2026-02-23 07:04:30.059838891 +0000 UTC m=+1192.311166055" observedRunningTime="2026-02-23 07:04:32.951560763 +0000 UTC m=+1195.202887897" watchObservedRunningTime="2026-02-23 07:04:32.955502307 +0000 UTC m=+1195.206829441" Feb 23 07:04:33 crc kubenswrapper[5047]: I0223 07:04:33.004775 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" podStartSLOduration=3.837337824 podStartE2EDuration="25.004754894s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:09.607678671 +0000 UTC m=+1171.859005805" lastFinishedPulling="2026-02-23 07:04:30.775095731 +0000 UTC m=+1193.026422875" observedRunningTime="2026-02-23 07:04:32.998208669 +0000 UTC m=+1195.249535793" watchObservedRunningTime="2026-02-23 07:04:33.004754894 +0000 UTC m=+1195.256082028" Feb 23 07:04:36 crc kubenswrapper[5047]: I0223 07:04:36.811238 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" event={"ID":"4f170913-8ccb-42e2-9113-23fb049373c9","Type":"ContainerStarted","Data":"146482479e48048fd5637cdaf2d0810be630e173d9369ff3bd9c8bb456905203"} Feb 23 07:04:36 crc kubenswrapper[5047]: I0223 07:04:36.812278 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:36 crc kubenswrapper[5047]: I0223 07:04:36.814014 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" event={"ID":"f1b21582-c058-4cbc-bc6b-95d77c4a526c","Type":"ContainerStarted","Data":"eb1af65618fcbe517aa386717c9112d62a7edbb954d0cb4d6a061a44ae1e35e8"} Feb 23 07:04:36 crc kubenswrapper[5047]: I0223 07:04:36.815535 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:36 crc kubenswrapper[5047]: I0223 07:04:36.844279 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" podStartSLOduration=25.604477578 podStartE2EDuration="28.84424703s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:32.444926726 +0000 UTC m=+1194.696253860" lastFinishedPulling="2026-02-23 07:04:35.684696178 +0000 UTC m=+1197.936023312" observedRunningTime="2026-02-23 07:04:36.839230997 +0000 UTC m=+1199.090558181" watchObservedRunningTime="2026-02-23 07:04:36.84424703 +0000 UTC m=+1199.095574204" Feb 23 07:04:36 crc kubenswrapper[5047]: I0223 07:04:36.881826 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" podStartSLOduration=25.593761175 podStartE2EDuration="28.881794407s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:32.37117313 +0000 UTC m=+1194.622500264" lastFinishedPulling="2026-02-23 07:04:35.659206362 +0000 UTC m=+1197.910533496" observedRunningTime="2026-02-23 07:04:36.877542283 +0000 UTC m=+1199.128869497" watchObservedRunningTime="2026-02-23 07:04:36.881794407 +0000 UTC m=+1199.133121551" Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.542336 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-7642d" Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.642547 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-2cbb6" Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.648362 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-6sl7g" Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.706076 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-7gz7l" Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.788724 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-cqtpj" Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.831927 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" event={"ID":"af0d0299-b4a6-47bd-99a1-567fd5126f5c","Type":"ContainerStarted","Data":"d64d219f8b740d85810b9a01946a6510699d091197e8ddd1a08a860919fa8fd2"} Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.832887 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" Feb 23 07:04:38 crc kubenswrapper[5047]: I0223 07:04:38.859656 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" podStartSLOduration=3.581040557 podStartE2EDuration="30.859635031s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.573253839 +0000 UTC m=+1172.824580963" lastFinishedPulling="2026-02-23 07:04:37.851848263 +0000 UTC m=+1200.103175437" 
observedRunningTime="2026-02-23 07:04:38.852374828 +0000 UTC m=+1201.103701962" watchObservedRunningTime="2026-02-23 07:04:38.859635031 +0000 UTC m=+1201.110962165" Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.028688 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pbjl2" Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.124929 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ft5sm" Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.167652 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-w297v" Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.240887 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-fqngb" Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.458525 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-5v756" Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.509431 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6f5tp" Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.844132 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" event={"ID":"f45166f5-93e4-4787-a25f-0c19e1e83cd5","Type":"ContainerStarted","Data":"68d4569b2d2f444d789b3bcb8fe29c9816261d2dd4e38e147a978cbcd2493320"} Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.845490 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" 
event={"ID":"86dd701f-396e-4fc1-8d42-29befb968db9","Type":"ContainerStarted","Data":"df18e2022ac90e298b7eea1fa36589d6bb0dd53086e51bf505d50dd92af305ad"} Feb 23 07:04:39 crc kubenswrapper[5047]: I0223 07:04:39.846696 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" event={"ID":"d331d283-a586-4c54-9732-151a8050ed40","Type":"ContainerStarted","Data":"d75f9a5cb971b4cb0b1867b794c4e46733ca25502bcf72b3dbc822aae05d3481"} Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.861645 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" event={"ID":"a263ae62-717d-4032-b1e2-042f1b3e936e","Type":"ContainerStarted","Data":"460406fe17e32cb4f19069da92bf347692005608a183dc315774e0835d4a0894"} Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.863136 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.863429 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.863537 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.863632 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.882208 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" podStartSLOduration=4.652457742 podStartE2EDuration="32.882178821s" 
podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.610691202 +0000 UTC m=+1172.862018336" lastFinishedPulling="2026-02-23 07:04:38.840412281 +0000 UTC m=+1201.091739415" observedRunningTime="2026-02-23 07:04:40.878092912 +0000 UTC m=+1203.129420076" watchObservedRunningTime="2026-02-23 07:04:40.882178821 +0000 UTC m=+1203.133505955" Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.914988 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" podStartSLOduration=2.7236568180000003 podStartE2EDuration="32.9149642s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.035561499 +0000 UTC m=+1172.286888633" lastFinishedPulling="2026-02-23 07:04:40.226868871 +0000 UTC m=+1202.478196015" observedRunningTime="2026-02-23 07:04:40.909071953 +0000 UTC m=+1203.160399097" watchObservedRunningTime="2026-02-23 07:04:40.9149642 +0000 UTC m=+1203.166291334" Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.936073 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" podStartSLOduration=4.463473509 podStartE2EDuration="32.936047799s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.371247671 +0000 UTC m=+1172.622574805" lastFinishedPulling="2026-02-23 07:04:38.843821961 +0000 UTC m=+1201.095149095" observedRunningTime="2026-02-23 07:04:40.933601654 +0000 UTC m=+1203.184928828" watchObservedRunningTime="2026-02-23 07:04:40.936047799 +0000 UTC m=+1203.187374943" Feb 23 07:04:40 crc kubenswrapper[5047]: I0223 07:04:40.955815 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" podStartSLOduration=4.707654756 podStartE2EDuration="32.955790343s" 
podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.591972326 +0000 UTC m=+1172.843299460" lastFinishedPulling="2026-02-23 07:04:38.840107913 +0000 UTC m=+1201.091435047" observedRunningTime="2026-02-23 07:04:40.954433426 +0000 UTC m=+1203.205760570" watchObservedRunningTime="2026-02-23 07:04:40.955790343 +0000 UTC m=+1203.207117497" Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.301458 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.316453 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a6a5f402-78f7-45b9-8358-c16da1787c4e-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-k9ww2\" (UID: \"a6a5f402-78f7-45b9-8358-c16da1787c4e\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.496165 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5thq4" Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.502768 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.870505 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" event={"ID":"f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04","Type":"ContainerStarted","Data":"a5ed4e4c7154dc5bc66d3670143ca2e8c7af2989d65d51e33bdeefde1d5da914"} Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.871072 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.897133 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" podStartSLOduration=3.698057778 podStartE2EDuration="33.897110137s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.610116956 +0000 UTC m=+1172.861444090" lastFinishedPulling="2026-02-23 07:04:40.809169275 +0000 UTC m=+1203.060496449" observedRunningTime="2026-02-23 07:04:41.893288026 +0000 UTC m=+1204.144615150" watchObservedRunningTime="2026-02-23 07:04:41.897110137 +0000 UTC m=+1204.148437271" Feb 23 07:04:41 crc kubenswrapper[5047]: I0223 07:04:41.916351 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2"] Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.882855 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" event={"ID":"a6a5f402-78f7-45b9-8358-c16da1787c4e","Type":"ContainerStarted","Data":"b165e1eed7119207e9222de79d1c75315516bdfdc5fd41024b9cdc512f6d3983"} Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.882939 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" event={"ID":"a6a5f402-78f7-45b9-8358-c16da1787c4e","Type":"ContainerStarted","Data":"32516abd2db25c1ff771c40693918d4554fe36989b0173d4294d84bfc8e9b7dc"} Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.883040 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.887444 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" event={"ID":"32942f2f-2abb-4f90-8444-ed9e77aeef57","Type":"ContainerStarted","Data":"7688d2be229753ec7f7d6a057656a0126a62fdcad9ecf71626a57fabd9f08548"} Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.887669 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.890104 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" event={"ID":"c67d85ac-ca9b-4b6f-826d-c04dd6b6850b","Type":"ContainerStarted","Data":"23b86ffbae9d29ac4044b180c2f092b246d588b476e6ca409ecb26fbe4d9b6d5"} Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.890445 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.921631 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" podStartSLOduration=34.921600678 podStartE2EDuration="34.921600678s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-23 07:04:42.919385609 +0000 UTC m=+1205.170712753" watchObservedRunningTime="2026-02-23 07:04:42.921600678 +0000 UTC m=+1205.172927812" Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.943975 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" podStartSLOduration=3.485992056 podStartE2EDuration="34.943947151s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.365674254 +0000 UTC m=+1172.617001388" lastFinishedPulling="2026-02-23 07:04:41.823629349 +0000 UTC m=+1204.074956483" observedRunningTime="2026-02-23 07:04:42.940704644 +0000 UTC m=+1205.192031798" watchObservedRunningTime="2026-02-23 07:04:42.943947151 +0000 UTC m=+1205.195274285" Feb 23 07:04:42 crc kubenswrapper[5047]: I0223 07:04:42.958156 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" podStartSLOduration=3.692632218 podStartE2EDuration="34.958120007s" podCreationTimestamp="2026-02-23 07:04:08 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.487525216 +0000 UTC m=+1172.738852350" lastFinishedPulling="2026-02-23 07:04:41.753013005 +0000 UTC m=+1204.004340139" observedRunningTime="2026-02-23 07:04:42.954766677 +0000 UTC m=+1205.206093811" watchObservedRunningTime="2026-02-23 07:04:42.958120007 +0000 UTC m=+1205.209447181" Feb 23 07:04:44 crc kubenswrapper[5047]: I0223 07:04:44.728572 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-hfcm8" Feb 23 07:04:44 crc kubenswrapper[5047]: I0223 07:04:44.763750 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl" Feb 23 07:04:48 crc kubenswrapper[5047]: I0223 07:04:48.966424 
5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qr226" Feb 23 07:04:49 crc kubenswrapper[5047]: I0223 07:04:49.019473 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-wdx4l" Feb 23 07:04:49 crc kubenswrapper[5047]: I0223 07:04:49.046308 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lmr7j" Feb 23 07:04:49 crc kubenswrapper[5047]: I0223 07:04:49.057254 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bg6jh" Feb 23 07:04:49 crc kubenswrapper[5047]: I0223 07:04:49.073974 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bs8mv" Feb 23 07:04:49 crc kubenswrapper[5047]: I0223 07:04:49.097485 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-h6tck" Feb 23 07:04:49 crc kubenswrapper[5047]: I0223 07:04:49.187958 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-t4qfd" Feb 23 07:04:49 crc kubenswrapper[5047]: I0223 07:04:49.194580 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8rhk" Feb 23 07:04:51 crc kubenswrapper[5047]: I0223 07:04:51.513558 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-k9ww2" Feb 23 07:04:56 crc kubenswrapper[5047]: I0223 07:04:56.064390 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" event={"ID":"d4f545e3-e6a7-41f1-84cb-895175df22cf","Type":"ContainerStarted","Data":"8923f99fe8ee98be7b40c2f62ec457ac477c41f2e3a626301978cd89aeb2f0e9"} Feb 23 07:04:56 crc kubenswrapper[5047]: I0223 07:04:56.098310 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p9fc8" podStartSLOduration=3.079043233 podStartE2EDuration="47.098273096s" podCreationTimestamp="2026-02-23 07:04:09 +0000 UTC" firstStartedPulling="2026-02-23 07:04:10.850170723 +0000 UTC m=+1173.101497857" lastFinishedPulling="2026-02-23 07:04:54.869400556 +0000 UTC m=+1217.120727720" observedRunningTime="2026-02-23 07:04:56.090945633 +0000 UTC m=+1218.342272837" watchObservedRunningTime="2026-02-23 07:04:56.098273096 +0000 UTC m=+1218.349600270" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:11.999567 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tkntc"] Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.001609 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.004770 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wdcxx" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.004781 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.005716 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.014058 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tkntc"] Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.015054 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.068469 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-slgj7"] Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.069665 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.074176 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.081822 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9tkk\" (UniqueName: \"kubernetes.io/projected/0a0431ae-3647-4aac-819e-9dde125e48d7-kube-api-access-g9tkk\") pod \"dnsmasq-dns-855cbc58c5-tkntc\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.081881 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a0431ae-3647-4aac-819e-9dde125e48d7-config\") pod \"dnsmasq-dns-855cbc58c5-tkntc\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.084377 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-slgj7"] Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.184016 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxhq9\" (UniqueName: \"kubernetes.io/projected/c1375f75-5068-4574-903d-82d9c1c4fe51-kube-api-access-gxhq9\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.184097 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-config\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " 
pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.184151 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9tkk\" (UniqueName: \"kubernetes.io/projected/0a0431ae-3647-4aac-819e-9dde125e48d7-kube-api-access-g9tkk\") pod \"dnsmasq-dns-855cbc58c5-tkntc\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.184174 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a0431ae-3647-4aac-819e-9dde125e48d7-config\") pod \"dnsmasq-dns-855cbc58c5-tkntc\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.184201 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.185506 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a0431ae-3647-4aac-819e-9dde125e48d7-config\") pod \"dnsmasq-dns-855cbc58c5-tkntc\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.209241 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9tkk\" (UniqueName: \"kubernetes.io/projected/0a0431ae-3647-4aac-819e-9dde125e48d7-kube-api-access-g9tkk\") pod \"dnsmasq-dns-855cbc58c5-tkntc\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc 
kubenswrapper[5047]: I0223 07:05:12.285316 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-config\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.285439 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.285487 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxhq9\" (UniqueName: \"kubernetes.io/projected/c1375f75-5068-4574-903d-82d9c1c4fe51-kube-api-access-gxhq9\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.287188 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-config\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.287781 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.303933 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gxhq9\" (UniqueName: \"kubernetes.io/projected/c1375f75-5068-4574-903d-82d9c1c4fe51-kube-api-access-gxhq9\") pod \"dnsmasq-dns-6fcf94d689-slgj7\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.322466 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.386345 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.617165 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tkntc"] Feb 23 07:05:12 crc kubenswrapper[5047]: W0223 07:05:12.628005 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a0431ae_3647_4aac_819e_9dde125e48d7.slice/crio-95e79a8a049b7cc91d79ee953a010f3388f1d8025469cb95535f98bd1e33ce19 WatchSource:0}: Error finding container 95e79a8a049b7cc91d79ee953a010f3388f1d8025469cb95535f98bd1e33ce19: Status 404 returned error can't find the container with id 95e79a8a049b7cc91d79ee953a010f3388f1d8025469cb95535f98bd1e33ce19 Feb 23 07:05:12 crc kubenswrapper[5047]: I0223 07:05:12.893816 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-slgj7"] Feb 23 07:05:12 crc kubenswrapper[5047]: W0223 07:05:12.900304 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1375f75_5068_4574_903d_82d9c1c4fe51.slice/crio-2923bea16fa8a0803806d4034f79e17922b42eaec05585000fb00c7026e0db99 WatchSource:0}: Error finding container 2923bea16fa8a0803806d4034f79e17922b42eaec05585000fb00c7026e0db99: Status 404 returned error can't find the container with id 
2923bea16fa8a0803806d4034f79e17922b42eaec05585000fb00c7026e0db99 Feb 23 07:05:13 crc kubenswrapper[5047]: I0223 07:05:13.243746 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" event={"ID":"c1375f75-5068-4574-903d-82d9c1c4fe51","Type":"ContainerStarted","Data":"2923bea16fa8a0803806d4034f79e17922b42eaec05585000fb00c7026e0db99"} Feb 23 07:05:13 crc kubenswrapper[5047]: I0223 07:05:13.245122 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" event={"ID":"0a0431ae-3647-4aac-819e-9dde125e48d7","Type":"ContainerStarted","Data":"95e79a8a049b7cc91d79ee953a010f3388f1d8025469cb95535f98bd1e33ce19"} Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.422452 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tkntc"] Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.493809 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-ndhg2"] Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.509284 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.515690 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-ndhg2"] Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.632805 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw4hd\" (UniqueName: \"kubernetes.io/projected/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-kube-api-access-bw4hd\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.632961 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-config\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.632993 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-dns-svc\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.734115 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-dns-svc\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.738095 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-dns-svc\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.738272 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw4hd\" (UniqueName: \"kubernetes.io/projected/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-kube-api-access-bw4hd\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.738649 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-config\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.739288 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-config\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.766346 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-slgj7"] Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.767269 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw4hd\" (UniqueName: \"kubernetes.io/projected/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-kube-api-access-bw4hd\") pod \"dnsmasq-dns-f54874ffc-ndhg2\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.791621 5047 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7jhwn"] Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.793946 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.826407 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.827215 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7jhwn"] Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.942296 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtdtj\" (UniqueName: \"kubernetes.io/projected/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-kube-api-access-qtdtj\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.942641 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-config\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" Feb 23 07:05:14 crc kubenswrapper[5047]: I0223 07:05:14.942813 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-dns-svc\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.044682 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-config\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.044744 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-dns-svc\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.044814 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtdtj\" (UniqueName: \"kubernetes.io/projected/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-kube-api-access-qtdtj\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.046031 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-dns-svc\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.046125 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-config\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.075878 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtdtj\" (UniqueName: \"kubernetes.io/projected/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-kube-api-access-qtdtj\") pod \"dnsmasq-dns-67ff45466c-7jhwn\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " pod="openstack/dnsmasq-dns-67ff45466c-7jhwn"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.127653 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.169498 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-ndhg2"]
Feb 23 07:05:15 crc kubenswrapper[5047]: W0223 07:05:15.182815 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf74f3c47_526d_483a_bf9b_5b319e6bf5b9.slice/crio-3f246833dacd98201d24c809cc0979b90a4d96befbb490841cb1486c69dd7cd6 WatchSource:0}: Error finding container 3f246833dacd98201d24c809cc0979b90a4d96befbb490841cb1486c69dd7cd6: Status 404 returned error can't find the container with id 3f246833dacd98201d24c809cc0979b90a4d96befbb490841cb1486c69dd7cd6
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.264766 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" event={"ID":"f74f3c47-526d-483a-bf9b-5b319e6bf5b9","Type":"ContainerStarted","Data":"3f246833dacd98201d24c809cc0979b90a4d96befbb490841cb1486c69dd7cd6"}
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.619887 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7jhwn"]
Feb 23 07:05:15 crc kubenswrapper[5047]: W0223 07:05:15.629174 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6ea1afd_fd03_4b55_86f1_0aedb614ed5d.slice/crio-2be837834caca8a75c89207a1ae27368d0cac4b7c64fa84072916a5751bd3ab7 WatchSource:0}: Error finding container 2be837834caca8a75c89207a1ae27368d0cac4b7c64fa84072916a5751bd3ab7: Status 404 returned error can't find the container with id 2be837834caca8a75c89207a1ae27368d0cac4b7c64fa84072916a5751bd3ab7
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.649611 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.651711 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.655774 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.655991 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9qj6z"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.656099 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.656222 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.656326 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.656452 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.656615 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.663627 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757227 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757294 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757336 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgtg\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-kube-api-access-hjgtg\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757373 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757575 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6776acf2-e53f-4892-847d-8667669a5eb9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757630 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757666 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757801 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757882 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6776acf2-e53f-4892-847d-8667669a5eb9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.757994 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.758027 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.859354 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.859494 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.860008 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.860038 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.860153 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgtg\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-kube-api-access-hjgtg\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.860189 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.860588 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6776acf2-e53f-4892-847d-8667669a5eb9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.860608 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.861006 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.861390 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.861441 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.861530 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6776acf2-e53f-4892-847d-8667669a5eb9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.861566 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.861605 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.862403 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.862754 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.863896 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.868302 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6776acf2-e53f-4892-847d-8667669a5eb9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.869687 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6776acf2-e53f-4892-847d-8667669a5eb9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.871452 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.871783 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.880845 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgtg\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-kube-api-access-hjgtg\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.898690 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.929493 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.934643 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.937172 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.937452 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.937556 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.937560 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.937616 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.937716 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.939761 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8dr62"
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.942082 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 07:05:15 crc kubenswrapper[5047]: I0223 07:05:15.987761 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067088 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067215 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067240 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067270 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067302 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067325 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067342 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e83857b-5e17-4878-8f9b-e8d1a65325ba-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067370 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067388 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067418 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e83857b-5e17-4878-8f9b-e8d1a65325ba-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.067444 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4pqg\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-kube-api-access-q4pqg\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.175865 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e83857b-5e17-4878-8f9b-e8d1a65325ba-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.175960 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4pqg\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-kube-api-access-q4pqg\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176089 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176111 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176135 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176170 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176214 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176250 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176270 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e83857b-5e17-4878-8f9b-e8d1a65325ba-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176297 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176323 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.176785 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.177299 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.178231 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.178763 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.182321 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.185747 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.190394 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e83857b-5e17-4878-8f9b-e8d1a65325ba-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.195665 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e83857b-5e17-4878-8f9b-e8d1a65325ba-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.200557 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.200651 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.220215 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4pqg\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-kube-api-access-q4pqg\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.300029 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" event={"ID":"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d","Type":"ContainerStarted","Data":"2be837834caca8a75c89207a1ae27368d0cac4b7c64fa84072916a5751bd3ab7"}
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.306360 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.574933 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:16 crc kubenswrapper[5047]: I0223 07:05:16.603309 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.297718 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.300796 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.303897 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.305002 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.304934 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-27gbm"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.310112 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.312155 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.313414 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420093 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420223 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420272 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420392 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420442 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420594 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420615 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.420633 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw7sm\" (UniqueName: \"kubernetes.io/projected/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kube-api-access-jw7sm\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529130 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw7sm\" (UniqueName: \"kubernetes.io/projected/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kube-api-access-jw7sm\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529179 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529200 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529285 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529309 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529329 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529364 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.529381 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0"
Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.531082 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID:
\"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.538624 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.539204 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.539241 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.539993 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-default\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.540037 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: 
I0223 07:05:17.540472 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kolla-config\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.549571 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw7sm\" (UniqueName: \"kubernetes.io/projected/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kube-api-access-jw7sm\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.557963 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " pod="openstack/openstack-galera-0" Feb 23 07:05:17 crc kubenswrapper[5047]: I0223 07:05:17.622673 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.638727 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.641325 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.645555 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.645738 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.645935 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9xp84" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.646132 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.647169 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.769962 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.770075 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.770100 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfn69\" (UniqueName: 
\"kubernetes.io/projected/9b66eae6-4565-4f56-8bdc-009aa1101a64-kube-api-access-pfn69\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.770133 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.770168 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.770203 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.770236 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.770267 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.871884 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.871988 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.872011 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfn69\" (UniqueName: \"kubernetes.io/projected/9b66eae6-4565-4f56-8bdc-009aa1101a64-kube-api-access-pfn69\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.872032 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.872065 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.872091 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.872112 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.872132 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.872619 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.873692 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.874753 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.876023 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.876728 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.908135 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.912631 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") 
" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.919342 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfn69\" (UniqueName: \"kubernetes.io/projected/9b66eae6-4565-4f56-8bdc-009aa1101a64-kube-api-access-pfn69\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.929136 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.931523 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.935671 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pt2td" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.936697 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.936849 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.938861 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.957429 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 07:05:18 crc kubenswrapper[5047]: I0223 07:05:18.987819 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.075578 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.075694 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-kolla-config\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.075714 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfk2q\" (UniqueName: \"kubernetes.io/projected/4c814692-df9e-470f-8aad-364d48f82b81-kube-api-access-xfk2q\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.075737 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.075804 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-config-data\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 
07:05:19.177483 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.177609 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-kolla-config\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.177639 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfk2q\" (UniqueName: \"kubernetes.io/projected/4c814692-df9e-470f-8aad-364d48f82b81-kube-api-access-xfk2q\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.177668 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.177691 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-config-data\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.182207 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-kolla-config\") pod 
\"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.182998 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.187985 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.199112 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfk2q\" (UniqueName: \"kubernetes.io/projected/4c814692-df9e-470f-8aad-364d48f82b81-kube-api-access-xfk2q\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.199706 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-config-data\") pod \"memcached-0\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " pod="openstack/memcached-0" Feb 23 07:05:19 crc kubenswrapper[5047]: I0223 07:05:19.277065 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 23 07:05:20 crc kubenswrapper[5047]: W0223 07:05:20.575811 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6776acf2_e53f_4892_847d_8667669a5eb9.slice/crio-4cb31b7c6c4c44a36739dd396af6d10d4f7cad518a35bfea978bc2caa4d8e43e WatchSource:0}: Error finding container 4cb31b7c6c4c44a36739dd396af6d10d4f7cad518a35bfea978bc2caa4d8e43e: Status 404 returned error can't find the container with id 4cb31b7c6c4c44a36739dd396af6d10d4f7cad518a35bfea978bc2caa4d8e43e Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.065564 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.066685 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.069293 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7wgvz" Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.078120 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.235980 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptpq6\" (UniqueName: \"kubernetes.io/projected/c0b4e975-a97e-42a0-80a8-f97734bcaff2-kube-api-access-ptpq6\") pod \"kube-state-metrics-0\" (UID: \"c0b4e975-a97e-42a0-80a8-f97734bcaff2\") " pod="openstack/kube-state-metrics-0" Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.337742 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptpq6\" (UniqueName: \"kubernetes.io/projected/c0b4e975-a97e-42a0-80a8-f97734bcaff2-kube-api-access-ptpq6\") pod \"kube-state-metrics-0\" (UID: 
\"c0b4e975-a97e-42a0-80a8-f97734bcaff2\") " pod="openstack/kube-state-metrics-0" Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.367788 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptpq6\" (UniqueName: \"kubernetes.io/projected/c0b4e975-a97e-42a0-80a8-f97734bcaff2-kube-api-access-ptpq6\") pod \"kube-state-metrics-0\" (UID: \"c0b4e975-a97e-42a0-80a8-f97734bcaff2\") " pod="openstack/kube-state-metrics-0" Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.368829 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6776acf2-e53f-4892-847d-8667669a5eb9","Type":"ContainerStarted","Data":"4cb31b7c6c4c44a36739dd396af6d10d4f7cad518a35bfea978bc2caa4d8e43e"} Feb 23 07:05:21 crc kubenswrapper[5047]: I0223 07:05:21.445566 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.791971 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fk6gc"] Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.798165 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.800946 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-krv26" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.801166 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.801336 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.813229 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fk6gc"] Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.867580 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-f7lbh"] Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.870346 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.896210 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-f7lbh"] Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.908141 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.908212 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-combined-ca-bundle\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.908237 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-log-ovn\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.908266 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79mt8\" (UniqueName: \"kubernetes.io/projected/6f9e1257-3765-4d4e-8110-81c55d1546d4-kube-api-access-79mt8\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.908341 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run-ovn\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.908415 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9e1257-3765-4d4e-8110-81c55d1546d4-scripts\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:24 crc kubenswrapper[5047]: I0223 07:05:24.908506 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-ovn-controller-tls-certs\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.010458 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-etc-ovs\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.010973 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9e1257-3765-4d4e-8110-81c55d1546d4-scripts\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011028 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-log\") pod 
\"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011047 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwq4t\" (UniqueName: \"kubernetes.io/projected/df2d42b2-545c-47ab-ba87-ff81a4cced8d-kube-api-access-mwq4t\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011073 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-ovn-controller-tls-certs\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011184 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-lib\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011353 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-run\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011450 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run\") pod \"ovn-controller-fk6gc\" (UID: 
\"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011509 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-combined-ca-bundle\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011557 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-log-ovn\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011593 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2d42b2-545c-47ab-ba87-ff81a4cced8d-scripts\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011616 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79mt8\" (UniqueName: \"kubernetes.io/projected/6f9e1257-3765-4d4e-8110-81c55d1546d4-kube-api-access-79mt8\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.011668 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run-ovn\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc 
kubenswrapper[5047]: I0223 07:05:25.012299 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run-ovn\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.012409 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.012555 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-log-ovn\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.014071 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9e1257-3765-4d4e-8110-81c55d1546d4-scripts\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.027518 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-combined-ca-bundle\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.027692 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-ovn-controller-tls-certs\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.029824 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79mt8\" (UniqueName: \"kubernetes.io/projected/6f9e1257-3765-4d4e-8110-81c55d1546d4-kube-api-access-79mt8\") pod \"ovn-controller-fk6gc\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.113711 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-etc-ovs\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.113806 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-log\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.113830 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwq4t\" (UniqueName: \"kubernetes.io/projected/df2d42b2-545c-47ab-ba87-ff81a4cced8d-kube-api-access-mwq4t\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.113863 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-lib\") pod \"ovn-controller-ovs-f7lbh\" (UID: 
\"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.113920 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-run\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.113964 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2d42b2-545c-47ab-ba87-ff81a4cced8d-scripts\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.114114 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-log\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.114110 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-run\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.114213 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-lib\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.114215 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-etc-ovs\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.117756 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2d42b2-545c-47ab-ba87-ff81a4cced8d-scripts\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.118763 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.143859 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwq4t\" (UniqueName: \"kubernetes.io/projected/df2d42b2-545c-47ab-ba87-ff81a4cced8d-kube-api-access-mwq4t\") pod \"ovn-controller-ovs-f7lbh\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:25 crc kubenswrapper[5047]: I0223 07:05:25.200222 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.329265 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.331037 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.333636 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.333863 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.334406 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.334607 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5cdgv" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.335044 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.351948 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.453494 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.453589 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-config\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.453637 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.453972 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.454202 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.454325 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.454473 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.454578 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6x24\" (UniqueName: 
\"kubernetes.io/projected/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-kube-api-access-k6x24\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.556861 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.556944 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.556980 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.557006 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.557037 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6x24\" (UniqueName: \"kubernetes.io/projected/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-kube-api-access-k6x24\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " 
pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.557114 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.557155 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-config\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.557185 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.557628 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.557653 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.558763 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-config\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.566536 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.568007 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.569204 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.572245 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.580788 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc 
kubenswrapper[5047]: I0223 07:05:27.589538 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6x24\" (UniqueName: \"kubernetes.io/projected/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-kube-api-access-k6x24\") pod \"ovsdbserver-sb-0\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:27 crc kubenswrapper[5047]: I0223 07:05:27.675925 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.046443 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.048495 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.052829 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.052951 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.053290 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-m4df9" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.053375 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.073389 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169158 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169232 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169265 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpb6b\" (UniqueName: \"kubernetes.io/projected/01ca037a-0388-4fdc-9106-1abbfc17566d-kube-api-access-tpb6b\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169515 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169589 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169626 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " 
pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169792 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.169847 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272098 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272186 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpb6b\" (UniqueName: \"kubernetes.io/projected/01ca037a-0388-4fdc-9106-1abbfc17566d-kube-api-access-tpb6b\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272260 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272293 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272320 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272366 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272400 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.272454 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-config\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.273228 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdb-rundir\") pod 
\"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.273394 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.274269 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-config\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.274699 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.279000 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.280490 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.281547 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.298302 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpb6b\" (UniqueName: \"kubernetes.io/projected/01ca037a-0388-4fdc-9106-1abbfc17566d-kube-api-access-tpb6b\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.303027 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:28 crc kubenswrapper[5047]: I0223 07:05:28.411990 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.539822 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.540318 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g9tkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities
{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-tkntc_openstack(0a0431ae-3647-4aac-819e-9dde125e48d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.541544 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" podUID="0a0431ae-3647-4aac-819e-9dde125e48d7" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.606956 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.607324 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gxhq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-slgj7_openstack(c1375f75-5068-4574-903d-82d9c1c4fe51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.608577 5047 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" podUID="c1375f75-5068-4574-903d-82d9c1c4fe51" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.608740 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.609048 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bw4hd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-ndhg2_openstack(f74f3c47-526d-483a-bf9b-5b319e6bf5b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:33 crc kubenswrapper[5047]: E0223 07:05:33.610259 5047 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" Feb 23 07:05:34 crc kubenswrapper[5047]: E0223 07:05:34.489346 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" Feb 23 07:05:34 crc kubenswrapper[5047]: E0223 07:05:34.861881 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:34 crc kubenswrapper[5047]: E0223 07:05:34.862584 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qtdtj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-7jhwn_openstack(e6ea1afd-fd03-4b55-86f1-0aedb614ed5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:34 crc kubenswrapper[5047]: E0223 07:05:34.863816 5047 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" Feb 23 07:05:34 crc kubenswrapper[5047]: I0223 07:05:34.986315 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:34 crc kubenswrapper[5047]: I0223 07:05:34.994658 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.105650 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9tkk\" (UniqueName: \"kubernetes.io/projected/0a0431ae-3647-4aac-819e-9dde125e48d7-kube-api-access-g9tkk\") pod \"0a0431ae-3647-4aac-819e-9dde125e48d7\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.105730 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a0431ae-3647-4aac-819e-9dde125e48d7-config\") pod \"0a0431ae-3647-4aac-819e-9dde125e48d7\" (UID: \"0a0431ae-3647-4aac-819e-9dde125e48d7\") " Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.105791 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-dns-svc\") pod \"c1375f75-5068-4574-903d-82d9c1c4fe51\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.105820 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxhq9\" (UniqueName: \"kubernetes.io/projected/c1375f75-5068-4574-903d-82d9c1c4fe51-kube-api-access-gxhq9\") pod \"c1375f75-5068-4574-903d-82d9c1c4fe51\" (UID: 
\"c1375f75-5068-4574-903d-82d9c1c4fe51\") " Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.105855 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-config\") pod \"c1375f75-5068-4574-903d-82d9c1c4fe51\" (UID: \"c1375f75-5068-4574-903d-82d9c1c4fe51\") " Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.108521 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-config" (OuterVolumeSpecName: "config") pod "c1375f75-5068-4574-903d-82d9c1c4fe51" (UID: "c1375f75-5068-4574-903d-82d9c1c4fe51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.108619 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a0431ae-3647-4aac-819e-9dde125e48d7-config" (OuterVolumeSpecName: "config") pod "0a0431ae-3647-4aac-819e-9dde125e48d7" (UID: "0a0431ae-3647-4aac-819e-9dde125e48d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.108965 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1375f75-5068-4574-903d-82d9c1c4fe51" (UID: "c1375f75-5068-4574-903d-82d9c1c4fe51"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.113496 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1375f75-5068-4574-903d-82d9c1c4fe51-kube-api-access-gxhq9" (OuterVolumeSpecName: "kube-api-access-gxhq9") pod "c1375f75-5068-4574-903d-82d9c1c4fe51" (UID: "c1375f75-5068-4574-903d-82d9c1c4fe51"). InnerVolumeSpecName "kube-api-access-gxhq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.114383 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0431ae-3647-4aac-819e-9dde125e48d7-kube-api-access-g9tkk" (OuterVolumeSpecName: "kube-api-access-g9tkk") pod "0a0431ae-3647-4aac-819e-9dde125e48d7" (UID: "0a0431ae-3647-4aac-819e-9dde125e48d7"). InnerVolumeSpecName "kube-api-access-g9tkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.208094 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9tkk\" (UniqueName: \"kubernetes.io/projected/0a0431ae-3647-4aac-819e-9dde125e48d7-kube-api-access-g9tkk\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.208130 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a0431ae-3647-4aac-819e-9dde125e48d7-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.208143 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.208154 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxhq9\" (UniqueName: 
\"kubernetes.io/projected/c1375f75-5068-4574-903d-82d9c1c4fe51-kube-api-access-gxhq9\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.208166 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1375f75-5068-4574-903d-82d9c1c4fe51-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.384994 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.394317 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 07:05:35 crc kubenswrapper[5047]: W0223 07:05:35.395161 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c814692_df9e_470f_8aad_364d48f82b81.slice/crio-393bac14fc8ea3deed8f5dfee5c88ca4c99ed425ce94b96d2881cc600d293cf4 WatchSource:0}: Error finding container 393bac14fc8ea3deed8f5dfee5c88ca4c99ed425ce94b96d2881cc600d293cf4: Status 404 returned error can't find the container with id 393bac14fc8ea3deed8f5dfee5c88ca4c99ed425ce94b96d2881cc600d293cf4 Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.511499 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4c814692-df9e-470f-8aad-364d48f82b81","Type":"ContainerStarted","Data":"393bac14fc8ea3deed8f5dfee5c88ca4c99ed425ce94b96d2881cc600d293cf4"} Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.513607 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b66eae6-4565-4f56-8bdc-009aa1101a64","Type":"ContainerStarted","Data":"560cfda7a2ac6dcc6ad6cf9845a31396ed7f647407dc569b698b26c35498c194"} Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.515404 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.515389 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-tkntc" event={"ID":"0a0431ae-3647-4aac-819e-9dde125e48d7","Type":"ContainerDied","Data":"95e79a8a049b7cc91d79ee953a010f3388f1d8025469cb95535f98bd1e33ce19"} Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.515752 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.517578 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" event={"ID":"c1375f75-5068-4574-903d-82d9c1c4fe51","Type":"ContainerDied","Data":"2923bea16fa8a0803806d4034f79e17922b42eaec05585000fb00c7026e0db99"} Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.517629 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-slgj7" Feb 23 07:05:35 crc kubenswrapper[5047]: E0223 07:05:35.518948 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" Feb 23 07:05:35 crc kubenswrapper[5047]: W0223 07:05:35.545202 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e83857b_5e17_4878_8f9b_e8d1a65325ba.slice/crio-396ccfbf501c118e907af06c08d0447705957a8468cc48604d4379479b6bfce0 WatchSource:0}: Error finding container 396ccfbf501c118e907af06c08d0447705957a8468cc48604d4379479b6bfce0: Status 404 returned error can't find the container with id 
396ccfbf501c118e907af06c08d0447705957a8468cc48604d4379479b6bfce0 Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.619338 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.674121 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.722859 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-slgj7"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.733441 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-slgj7"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.739287 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fk6gc"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.744600 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.755460 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tkntc"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.760399 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tkntc"] Feb 23 07:05:35 crc kubenswrapper[5047]: I0223 07:05:35.765321 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.198017 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-f7lbh"] Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.359687 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0431ae-3647-4aac-819e-9dde125e48d7" path="/var/lib/kubelet/pods/0a0431ae-3647-4aac-819e-9dde125e48d7/volumes" Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.361028 5047 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="c1375f75-5068-4574-903d-82d9c1c4fe51" path="/var/lib/kubelet/pods/c1375f75-5068-4574-903d-82d9c1c4fe51/volumes" Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.533038 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8","Type":"ContainerStarted","Data":"53842cd9436ba4b00e89054c5237078030ef7cd1d36577b1aec59563f9b5b96d"} Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.537109 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6776acf2-e53f-4892-847d-8667669a5eb9","Type":"ContainerStarted","Data":"1550a785ef91ff8b4c17d9b0e6b0acd706caa1828e577823dabee876eb27c6e4"} Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.539088 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e83857b-5e17-4878-8f9b-e8d1a65325ba","Type":"ContainerStarted","Data":"396ccfbf501c118e907af06c08d0447705957a8468cc48604d4379479b6bfce0"} Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.541952 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b","Type":"ContainerStarted","Data":"18353dc955d9ec5be7959ce8e71b36cef9f279e75b3d00947b619e9d431c8749"} Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.544270 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c0b4e975-a97e-42a0-80a8-f97734bcaff2","Type":"ContainerStarted","Data":"19817381454e5b946ff94789c84fd485fc5ea8df527ca113f294803b838b2d7e"} Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.546594 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fk6gc" event={"ID":"6f9e1257-3765-4d4e-8110-81c55d1546d4","Type":"ContainerStarted","Data":"37a65ce9726d6562bc400aab960c5380596ebe4d5aeeb9f93d6f608a1755cf92"} Feb 
23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.548182 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"01ca037a-0388-4fdc-9106-1abbfc17566d","Type":"ContainerStarted","Data":"cf311530a77d8812faec78173b72066894f73fbdc50a2ec099fdf21241d0732f"} Feb 23 07:05:36 crc kubenswrapper[5047]: I0223 07:05:36.549720 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerStarted","Data":"afc14c445101fdef81b94ea4fa12525cc441fc7d87326336d97f9bbbadb51fef"} Feb 23 07:05:37 crc kubenswrapper[5047]: I0223 07:05:37.558652 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e83857b-5e17-4878-8f9b-e8d1a65325ba","Type":"ContainerStarted","Data":"57718f64876e2968f7e2d3c286ddd88b4f77442fd1898477e29b9138ee17816a"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.690981 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerStarted","Data":"7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.692420 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8","Type":"ContainerStarted","Data":"8656eb8e7f120230bec3b1f18fdcd072d9f82031f44305eb6078bf7e798a30c4"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.693744 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b66eae6-4565-4f56-8bdc-009aa1101a64","Type":"ContainerStarted","Data":"8b62822f363fb0c003c60c543c9eb3209f0b6f65b4cfe08cf2245bd1ec54bdae"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.695355 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b","Type":"ContainerStarted","Data":"49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.697051 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c0b4e975-a97e-42a0-80a8-f97734bcaff2","Type":"ContainerStarted","Data":"162cb7f8c6e55e75e580ec58b388952191f54181ae019a1fa7f507a11cbff878"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.697200 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.698916 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4c814692-df9e-470f-8aad-364d48f82b81","Type":"ContainerStarted","Data":"bcb9e71f1342662d147a950f9bce3acb70d72672c9ecd0106c2560c4f8b80214"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.699123 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.701845 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fk6gc" event={"ID":"6f9e1257-3765-4d4e-8110-81c55d1546d4","Type":"ContainerStarted","Data":"6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.701970 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fk6gc" Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.704028 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"01ca037a-0388-4fdc-9106-1abbfc17566d","Type":"ContainerStarted","Data":"dbc428f4bdef3134e02420519caff4df17024718afa37dadd912cd858a13c0cc"} Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.797404 5047 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-fk6gc" podStartSLOduration=12.458819321 podStartE2EDuration="20.797379754s" podCreationTimestamp="2026-02-23 07:05:24 +0000 UTC" firstStartedPulling="2026-02-23 07:05:35.719699508 +0000 UTC m=+1257.971026642" lastFinishedPulling="2026-02-23 07:05:44.058259941 +0000 UTC m=+1266.309587075" observedRunningTime="2026-02-23 07:05:44.793419219 +0000 UTC m=+1267.044746363" watchObservedRunningTime="2026-02-23 07:05:44.797379754 +0000 UTC m=+1267.048706908" Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.823436 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.988667259 podStartE2EDuration="26.823408006s" podCreationTimestamp="2026-02-23 07:05:18 +0000 UTC" firstStartedPulling="2026-02-23 07:05:35.397588024 +0000 UTC m=+1257.648915178" lastFinishedPulling="2026-02-23 07:05:43.232328791 +0000 UTC m=+1265.483655925" observedRunningTime="2026-02-23 07:05:44.812776874 +0000 UTC m=+1267.064104008" watchObservedRunningTime="2026-02-23 07:05:44.823408006 +0000 UTC m=+1267.074735150" Feb 23 07:05:44 crc kubenswrapper[5047]: I0223 07:05:44.872661 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.363516346 podStartE2EDuration="23.872634655s" podCreationTimestamp="2026-02-23 07:05:21 +0000 UTC" firstStartedPulling="2026-02-23 07:05:35.648025182 +0000 UTC m=+1257.899352316" lastFinishedPulling="2026-02-23 07:05:44.157143491 +0000 UTC m=+1266.408470625" observedRunningTime="2026-02-23 07:05:44.860737369 +0000 UTC m=+1267.112064503" watchObservedRunningTime="2026-02-23 07:05:44.872634655 +0000 UTC m=+1267.123961789" Feb 23 07:05:45 crc kubenswrapper[5047]: I0223 07:05:45.716153 5047 generic.go:334] "Generic (PLEG): container finished" podID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerID="7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c" exitCode=0 Feb 23 07:05:45 
crc kubenswrapper[5047]: I0223 07:05:45.716279 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerDied","Data":"7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c"} Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.727442 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerStarted","Data":"0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48"} Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.728225 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.728246 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.728265 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerStarted","Data":"62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945"} Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.729891 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8","Type":"ContainerStarted","Data":"98cb4fcb07ef6bff93f43c15de37a5ab70e29f4de36a5da8359c405261211da7"} Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.731274 5047 generic.go:334] "Generic (PLEG): container finished" podID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerID="83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696" exitCode=0 Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.731355 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" 
event={"ID":"f74f3c47-526d-483a-bf9b-5b319e6bf5b9","Type":"ContainerDied","Data":"83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696"} Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.733732 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"01ca037a-0388-4fdc-9106-1abbfc17566d","Type":"ContainerStarted","Data":"aab582d93c645f1cd1f93f646d13990d4cebf1f69a19788d254697fedd7fc7f7"} Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.759531 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.759628 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.766436 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-f7lbh" podStartSLOduration=14.928120516 podStartE2EDuration="22.766413168s" podCreationTimestamp="2026-02-23 07:05:24 +0000 UTC" firstStartedPulling="2026-02-23 07:05:36.218923612 +0000 UTC m=+1258.470250746" lastFinishedPulling="2026-02-23 07:05:44.057216264 +0000 UTC m=+1266.308543398" observedRunningTime="2026-02-23 07:05:46.760047269 +0000 UTC m=+1269.011374413" watchObservedRunningTime="2026-02-23 07:05:46.766413168 +0000 UTC m=+1269.017740302" Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.835450 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" 
podStartSLOduration=9.692735621 podStartE2EDuration="19.835417893s" podCreationTimestamp="2026-02-23 07:05:27 +0000 UTC" firstStartedPulling="2026-02-23 07:05:35.738653143 +0000 UTC m=+1257.989980277" lastFinishedPulling="2026-02-23 07:05:45.881335415 +0000 UTC m=+1268.132662549" observedRunningTime="2026-02-23 07:05:46.824465722 +0000 UTC m=+1269.075792856" watchObservedRunningTime="2026-02-23 07:05:46.835417893 +0000 UTC m=+1269.086745037" Feb 23 07:05:46 crc kubenswrapper[5047]: I0223 07:05:46.852546 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.689179067 podStartE2EDuration="20.852522768s" podCreationTimestamp="2026-02-23 07:05:26 +0000 UTC" firstStartedPulling="2026-02-23 07:05:35.719018211 +0000 UTC m=+1257.970345345" lastFinishedPulling="2026-02-23 07:05:45.882361912 +0000 UTC m=+1268.133689046" observedRunningTime="2026-02-23 07:05:46.8518381 +0000 UTC m=+1269.103165234" watchObservedRunningTime="2026-02-23 07:05:46.852522768 +0000 UTC m=+1269.103849922" Feb 23 07:05:47 crc kubenswrapper[5047]: I0223 07:05:47.677585 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:47 crc kubenswrapper[5047]: I0223 07:05:47.748615 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" event={"ID":"f74f3c47-526d-483a-bf9b-5b319e6bf5b9","Type":"ContainerStarted","Data":"e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35"} Feb 23 07:05:47 crc kubenswrapper[5047]: I0223 07:05:47.783627 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" podStartSLOduration=3.103329238 podStartE2EDuration="33.783595834s" podCreationTimestamp="2026-02-23 07:05:14 +0000 UTC" firstStartedPulling="2026-02-23 07:05:15.203226247 +0000 UTC m=+1237.454553381" lastFinishedPulling="2026-02-23 07:05:45.883492843 +0000 UTC m=+1268.134819977" 
observedRunningTime="2026-02-23 07:05:47.775614402 +0000 UTC m=+1270.026941576" watchObservedRunningTime="2026-02-23 07:05:47.783595834 +0000 UTC m=+1270.034923008" Feb 23 07:05:48 crc kubenswrapper[5047]: I0223 07:05:48.412197 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:48 crc kubenswrapper[5047]: I0223 07:05:48.676768 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:48 crc kubenswrapper[5047]: I0223 07:05:48.748132 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:48 crc kubenswrapper[5047]: I0223 07:05:48.759936 5047 generic.go:334] "Generic (PLEG): container finished" podID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerID="49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586" exitCode=0 Feb 23 07:05:48 crc kubenswrapper[5047]: I0223 07:05:48.760037 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b","Type":"ContainerDied","Data":"49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586"} Feb 23 07:05:48 crc kubenswrapper[5047]: I0223 07:05:48.764848 5047 generic.go:334] "Generic (PLEG): container finished" podID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerID="8b62822f363fb0c003c60c543c9eb3209f0b6f65b4cfe08cf2245bd1ec54bdae" exitCode=0 Feb 23 07:05:48 crc kubenswrapper[5047]: I0223 07:05:48.766495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b66eae6-4565-4f56-8bdc-009aa1101a64","Type":"ContainerDied","Data":"8b62822f363fb0c003c60c543c9eb3209f0b6f65b4cfe08cf2245bd1ec54bdae"} Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.280566 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 23 07:05:49 crc kubenswrapper[5047]: 
I0223 07:05:49.413314 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.472368 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.777680 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b","Type":"ContainerStarted","Data":"585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe"} Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.782191 5047 generic.go:334] "Generic (PLEG): container finished" podID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerID="7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47" exitCode=0 Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.782303 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" event={"ID":"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d","Type":"ContainerDied","Data":"7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47"} Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.786459 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b66eae6-4565-4f56-8bdc-009aa1101a64","Type":"ContainerStarted","Data":"2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57"} Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.828404 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.832124 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.424270245 podStartE2EDuration="33.832000979s" podCreationTimestamp="2026-02-23 07:05:16 +0000 UTC" firstStartedPulling="2026-02-23 
07:05:35.648945706 +0000 UTC m=+1257.900272840" lastFinishedPulling="2026-02-23 07:05:44.05667644 +0000 UTC m=+1266.308003574" observedRunningTime="2026-02-23 07:05:49.812890721 +0000 UTC m=+1272.064217885" watchObservedRunningTime="2026-02-23 07:05:49.832000979 +0000 UTC m=+1272.083328213" Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.859438 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.897433 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.737996149 podStartE2EDuration="32.897407238s" podCreationTimestamp="2026-02-23 07:05:17 +0000 UTC" firstStartedPulling="2026-02-23 07:05:35.400643475 +0000 UTC m=+1257.651970619" lastFinishedPulling="2026-02-23 07:05:43.560054534 +0000 UTC m=+1265.811381708" observedRunningTime="2026-02-23 07:05:49.877738256 +0000 UTC m=+1272.129065400" watchObservedRunningTime="2026-02-23 07:05:49.897407238 +0000 UTC m=+1272.148734382" Feb 23 07:05:49 crc kubenswrapper[5047]: I0223 07:05:49.905687 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.088374 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7jhwn"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.131171 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-6w2hm"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.132707 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.136173 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.142826 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-6w2hm"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.196409 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwd7\" (UniqueName: \"kubernetes.io/projected/c7651c70-c537-4aa9-88ca-daf3ac0076d4-kube-api-access-4dwd7\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.196617 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.196859 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-config\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.197061 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-dns-svc\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 
07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.298673 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.299232 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-config\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.299260 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-dns-svc\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.299289 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwd7\" (UniqueName: \"kubernetes.io/projected/c7651c70-c537-4aa9-88ca-daf3ac0076d4-kube-api-access-4dwd7\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.300198 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-ndhg2"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.300627 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: 
\"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.301424 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-dns-svc\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.301491 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-config\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.359664 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwd7\" (UniqueName: \"kubernetes.io/projected/c7651c70-c537-4aa9-88ca-daf3ac0076d4-kube-api-access-4dwd7\") pod \"dnsmasq-dns-57bdd75c-6w2hm\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.373930 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ds2lt"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.375770 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.379975 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ds2lt"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.382122 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.385737 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hzs78"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.387991 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.397113 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.415458 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.422979 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.424781 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.425109 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.425283 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-tnbwk" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.425946 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.427631 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hzs78"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.447687 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.451136 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.505138 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507145 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507206 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxxt\" (UniqueName: \"kubernetes.io/projected/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-kube-api-access-fnxxt\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507248 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507327 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-config\") pod \"ovn-controller-metrics-hzs78\" (UID: 
\"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507371 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507402 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-config\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507512 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovs-rundir\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507551 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507598 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-config\") pod \"ovn-northd-0\" (UID: 
\"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507622 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507715 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507883 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-scripts\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.507960 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovn-rundir\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.508030 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-hzs78\" (UID: 
\"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.508063 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/b9ebb281-4310-483e-b599-3d3c8775e341-kube-api-access-gncfh\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.508092 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n59jp\" (UniqueName: \"kubernetes.io/projected/d7c0b33b-271e-4121-b14e-892fbed8edd8-kube-api-access-n59jp\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.508184 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.610110 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.610588 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " 
pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.610611 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.610644 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxxt\" (UniqueName: \"kubernetes.io/projected/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-kube-api-access-fnxxt\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.610672 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.610698 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-config\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.612103 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.612217 5047 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.612816 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-config\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.613125 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.613508 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-config\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.613669 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovs-rundir\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.613713 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.613781 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-config\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.613841 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.613886 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.614082 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-scripts\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.614153 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovn-rundir\") pod \"ovn-controller-metrics-hzs78\" (UID: 
\"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.614248 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.614284 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/b9ebb281-4310-483e-b599-3d3c8775e341-kube-api-access-gncfh\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.614323 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n59jp\" (UniqueName: \"kubernetes.io/projected/d7c0b33b-271e-4121-b14e-892fbed8edd8-kube-api-access-n59jp\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.616612 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.616894 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovn-rundir\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 
23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.617343 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovs-rundir\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.617570 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.618015 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-scripts\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.618276 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-config\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.618900 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-config\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.619002 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-dns-svc\") pod 
\"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.619017 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.619356 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.622330 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.629428 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxxt\" (UniqueName: \"kubernetes.io/projected/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-kube-api-access-fnxxt\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.634570 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n59jp\" (UniqueName: \"kubernetes.io/projected/d7c0b33b-271e-4121-b14e-892fbed8edd8-kube-api-access-n59jp\") pod \"dnsmasq-dns-75b7bcc64f-ds2lt\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") " 
pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.636184 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hzs78\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.636591 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/b9ebb281-4310-483e-b599-3d3c8775e341-kube-api-access-gncfh\") pod \"ovn-northd-0\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.729763 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.744845 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.769425 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.813287 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" event={"ID":"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d","Type":"ContainerStarted","Data":"20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998"} Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.813944 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerName="dnsmasq-dns" containerID="cri-o://20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998" gracePeriod=10 Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.814789 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerName="dnsmasq-dns" containerID="cri-o://e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35" gracePeriod=10 Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.842163 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" podStartSLOduration=-9223372000.012634 podStartE2EDuration="36.842141928s" podCreationTimestamp="2026-02-23 07:05:14 +0000 UTC" firstStartedPulling="2026-02-23 07:05:15.634763451 +0000 UTC m=+1237.886090585" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:50.838761068 +0000 UTC m=+1273.090088202" watchObservedRunningTime="2026-02-23 07:05:50.842141928 +0000 UTC m=+1273.093469062" Feb 23 07:05:50 crc kubenswrapper[5047]: I0223 07:05:50.941143 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-6w2hm"] Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.305192 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-75b7bcc64f-ds2lt"] Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.450275 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.462245 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.471570 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-6w2hm"] Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.486416 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.535693 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hzs78"] Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.591049 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-tx8hv"] Feb 23 07:05:51 crc kubenswrapper[5047]: E0223 07:05:51.595065 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerName="init" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.595122 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerName="init" Feb 23 07:05:51 crc kubenswrapper[5047]: E0223 07:05:51.595136 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerName="dnsmasq-dns" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.595142 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerName="dnsmasq-dns" Feb 23 07:05:51 crc kubenswrapper[5047]: E0223 07:05:51.595191 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerName="init" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.595201 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerName="init" Feb 23 07:05:51 crc kubenswrapper[5047]: E0223 07:05:51.595239 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerName="dnsmasq-dns" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.595249 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerName="dnsmasq-dns" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.595652 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerName="dnsmasq-dns" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.595677 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerName="dnsmasq-dns" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.643619 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.643825 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.656283 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtdtj\" (UniqueName: \"kubernetes.io/projected/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-kube-api-access-qtdtj\") pod \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.656384 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-config\") pod \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.656492 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-config\") pod \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.656586 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-dns-svc\") pod \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\" (UID: \"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d\") " Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.656629 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-dns-svc\") pod \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.656749 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw4hd\" (UniqueName: 
\"kubernetes.io/projected/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-kube-api-access-bw4hd\") pod \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\" (UID: \"f74f3c47-526d-483a-bf9b-5b319e6bf5b9\") " Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.676703 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-tx8hv"] Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.679874 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-kube-api-access-qtdtj" (OuterVolumeSpecName: "kube-api-access-qtdtj") pod "e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" (UID: "e6ea1afd-fd03-4b55-86f1-0aedb614ed5d"). InnerVolumeSpecName "kube-api-access-qtdtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.684095 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-kube-api-access-bw4hd" (OuterVolumeSpecName: "kube-api-access-bw4hd") pod "f74f3c47-526d-483a-bf9b-5b319e6bf5b9" (UID: "f74f3c47-526d-483a-bf9b-5b319e6bf5b9"). InnerVolumeSpecName "kube-api-access-bw4hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.720293 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" (UID: "e6ea1afd-fd03-4b55-86f1-0aedb614ed5d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.726780 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-config" (OuterVolumeSpecName: "config") pod "f74f3c47-526d-483a-bf9b-5b319e6bf5b9" (UID: "f74f3c47-526d-483a-bf9b-5b319e6bf5b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.730923 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-config" (OuterVolumeSpecName: "config") pod "e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" (UID: "e6ea1afd-fd03-4b55-86f1-0aedb614ed5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.742762 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f74f3c47-526d-483a-bf9b-5b319e6bf5b9" (UID: "f74f3c47-526d-483a-bf9b-5b319e6bf5b9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.759650 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763183 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-config\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763229 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763262 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzl8\" (UniqueName: \"kubernetes.io/projected/8c765369-da46-4bff-96f0-279d6c0b3f2c-kube-api-access-qtzl8\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763283 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: 
\"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763409 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763434 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763443 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763454 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw4hd\" (UniqueName: \"kubernetes.io/projected/f74f3c47-526d-483a-bf9b-5b319e6bf5b9-kube-api-access-bw4hd\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763463 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtdtj\" (UniqueName: \"kubernetes.io/projected/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-kube-api-access-qtdtj\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.763472 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.835965 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hzs78" event={"ID":"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d","Type":"ContainerStarted","Data":"3103dd1797324eceb063028eb91c70e4411d7a06ccdd9013ce0fb5bbb26c17d7"} Feb 23 07:05:51 crc 
kubenswrapper[5047]: I0223 07:05:51.839360 5047 generic.go:334] "Generic (PLEG): container finished" podID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerID="b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04" exitCode=0 Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.839414 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" event={"ID":"d7c0b33b-271e-4121-b14e-892fbed8edd8","Type":"ContainerDied","Data":"b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.839433 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" event={"ID":"d7c0b33b-271e-4121-b14e-892fbed8edd8","Type":"ContainerStarted","Data":"012f05ed58492e69a79053ef74dd67af8603302c09253bfa2047122e33411c3f"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868056 5047 generic.go:334] "Generic (PLEG): container finished" podID="c7651c70-c537-4aa9-88ca-daf3ac0076d4" containerID="00a7f088734a75d96b7656ddad33afada199fb34a41d6c188a90b01ff17d902b" exitCode=0 Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868224 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" event={"ID":"c7651c70-c537-4aa9-88ca-daf3ac0076d4","Type":"ContainerDied","Data":"00a7f088734a75d96b7656ddad33afada199fb34a41d6c188a90b01ff17d902b"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868265 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" event={"ID":"c7651c70-c537-4aa9-88ca-daf3ac0076d4","Type":"ContainerStarted","Data":"62df6943de4619fd57c74c16322f398c7bd40b4c9d701a4dfa43d8098126c568"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868541 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzl8\" (UniqueName: 
\"kubernetes.io/projected/8c765369-da46-4bff-96f0-279d6c0b3f2c-kube-api-access-qtzl8\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868590 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868674 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868735 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-config\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.868787 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.873139 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-config\") pod 
\"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.873780 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.874796 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.884150 5047 generic.go:334] "Generic (PLEG): container finished" podID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" containerID="20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998" exitCode=0 Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.884231 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" event={"ID":"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d","Type":"ContainerDied","Data":"20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.884266 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" event={"ID":"e6ea1afd-fd03-4b55-86f1-0aedb614ed5d","Type":"ContainerDied","Data":"2be837834caca8a75c89207a1ae27368d0cac4b7c64fa84072916a5751bd3ab7"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.884285 5047 scope.go:117] "RemoveContainer" containerID="20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 
07:05:51.884468 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7jhwn" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.886509 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.896095 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzl8\" (UniqueName: \"kubernetes.io/projected/8c765369-da46-4bff-96f0-279d6c0b3f2c-kube-api-access-qtzl8\") pod \"dnsmasq-dns-689df5d84f-tx8hv\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") " pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.916301 5047 generic.go:334] "Generic (PLEG): container finished" podID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" containerID="e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35" exitCode=0 Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.916411 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" event={"ID":"f74f3c47-526d-483a-bf9b-5b319e6bf5b9","Type":"ContainerDied","Data":"e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.925117 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" event={"ID":"f74f3c47-526d-483a-bf9b-5b319e6bf5b9","Type":"ContainerDied","Data":"3f246833dacd98201d24c809cc0979b90a4d96befbb490841cb1486c69dd7cd6"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.916491 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-ndhg2" Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.931756 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9ebb281-4310-483e-b599-3d3c8775e341","Type":"ContainerStarted","Data":"b5afbdb946c72bb7bbbfe183450ae3c3aad2bc7a179458e53fb8a61799f6f391"} Feb 23 07:05:51 crc kubenswrapper[5047]: I0223 07:05:51.995077 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.059351 5047 scope.go:117] "RemoveContainer" containerID="7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.085093 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-ndhg2"] Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.100891 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-ndhg2"] Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.122998 5047 scope.go:117] "RemoveContainer" containerID="20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.123561 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998\": container with ID starting with 20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998 not found: ID does not exist" containerID="20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.123611 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998"} err="failed to get container status 
\"20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998\": rpc error: code = NotFound desc = could not find container \"20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998\": container with ID starting with 20e1066af98349c345b9c13a71f963cface0277d4dae09a7e39700e0b6e30998 not found: ID does not exist" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.123671 5047 scope.go:117] "RemoveContainer" containerID="7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.127812 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47\": container with ID starting with 7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47 not found: ID does not exist" containerID="7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.127873 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47"} err="failed to get container status \"7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47\": rpc error: code = NotFound desc = could not find container \"7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47\": container with ID starting with 7c47637603783c1e4d108ef8ff2f579528131484ba7b615e78e7e9ee722c9d47 not found: ID does not exist" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.127937 5047 scope.go:117] "RemoveContainer" containerID="e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.131480 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7jhwn"] Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.185137 5047 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7jhwn"] Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.283184 5047 scope.go:117] "RemoveContainer" containerID="83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.376511 5047 scope.go:117] "RemoveContainer" containerID="e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.377359 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35\": container with ID starting with e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35 not found: ID does not exist" containerID="e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.377406 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35"} err="failed to get container status \"e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35\": rpc error: code = NotFound desc = could not find container \"e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35\": container with ID starting with e25f4b49e7d65d39e17d899ae294165e10a6e3e040ea9c881305923af8671c35 not found: ID does not exist" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.377442 5047 scope.go:117] "RemoveContainer" containerID="83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.389447 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696\": container with ID starting with 
83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696 not found: ID does not exist" containerID="83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.389513 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696"} err="failed to get container status \"83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696\": rpc error: code = NotFound desc = could not find container \"83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696\": container with ID starting with 83424b2b24f79daa8cfd22a12cfe274adc2d2263cc29b5b57bb07860a21fb696 not found: ID does not exist" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.391993 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ea1afd-fd03-4b55-86f1-0aedb614ed5d" path="/var/lib/kubelet/pods/e6ea1afd-fd03-4b55-86f1-0aedb614ed5d/volumes" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.393064 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74f3c47-526d-483a-bf9b-5b319e6bf5b9" path="/var/lib/kubelet/pods/f74f3c47-526d-483a-bf9b-5b319e6bf5b9/volumes" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.403528 5047 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 23 07:05:52 crc kubenswrapper[5047]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d7c0b33b-271e-4121-b14e-892fbed8edd8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:05:52 crc kubenswrapper[5047]: > podSandboxID="012f05ed58492e69a79053ef74dd67af8603302c09253bfa2047122e33411c3f" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.403758 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:05:52 crc kubenswrapper[5047]: container 
&Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n59jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75b7bcc64f-ds2lt_openstack(d7c0b33b-271e-4121-b14e-892fbed8edd8): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d7c0b33b-271e-4121-b14e-892fbed8edd8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:05:52 crc kubenswrapper[5047]: > logger="UnhandledError" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.405042 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d7c0b33b-271e-4121-b14e-892fbed8edd8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.446264 5047 log.go:32] 
"CreateContainer in sandbox from runtime service failed" err=< Feb 23 07:05:52 crc kubenswrapper[5047]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c7651c70-c537-4aa9-88ca-daf3ac0076d4/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:05:52 crc kubenswrapper[5047]: > podSandboxID="62df6943de4619fd57c74c16322f398c7bd40b4c9d701a4dfa43d8098126c568" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.446491 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:05:52 crc kubenswrapper[5047]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh65dh95hf6h595hf6hf5h59dh6h57dh558h55ch5dbh5f5h565h5f7h9fh76h58ch54dh84h59bh7fh6bh5b9h59h67fh566h56h5f4h554h58fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dwd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57bdd75c-6w2hm_openstack(c7651c70-c537-4aa9-88ca-daf3ac0076d4): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c7651c70-c537-4aa9-88ca-daf3ac0076d4/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:05:52 crc kubenswrapper[5047]: > logger="UnhandledError" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.451120 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c7651c70-c537-4aa9-88ca-daf3ac0076d4/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" podUID="c7651c70-c537-4aa9-88ca-daf3ac0076d4" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.623613 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-tx8hv"] Feb 23 07:05:52 crc kubenswrapper[5047]: W0223 07:05:52.635314 5047 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c765369_da46_4bff_96f0_279d6c0b3f2c.slice/crio-aa35409474a6e9fe7bdbc1a97dbe79db571b94bfde41cde6ede6d634b1fd7e1f WatchSource:0}: Error finding container aa35409474a6e9fe7bdbc1a97dbe79db571b94bfde41cde6ede6d634b1fd7e1f: Status 404 returned error can't find the container with id aa35409474a6e9fe7bdbc1a97dbe79db571b94bfde41cde6ede6d634b1fd7e1f Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.674453 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.685161 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.687427 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.689379 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.692538 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bkdmf" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.692576 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.702316 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.800631 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-lock\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.800676 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-cache\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.800708 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c288fd-a798-4337-882a-ab4ebb8331cb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.800753 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.800779 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wffzr\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-kube-api-access-wffzr\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.800829 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.902189 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.902275 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-lock\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.902299 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-cache\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.902329 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c288fd-a798-4337-882a-ab4ebb8331cb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.902371 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.902399 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wffzr\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-kube-api-access-wffzr\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 
07:05:52.902633 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.902956 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-lock\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.903197 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-cache\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.903199 5047 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.903250 5047 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 07:05:52 crc kubenswrapper[5047]: E0223 07:05:52.903294 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift podName:62c288fd-a798-4337-882a-ab4ebb8331cb nodeName:}" failed. No retries permitted until 2026-02-23 07:05:53.403275741 +0000 UTC m=+1275.654602875 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift") pod "swift-storage-0" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb") : configmap "swift-ring-files" not found Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.914262 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c288fd-a798-4337-882a-ab4ebb8331cb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.928051 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wffzr\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-kube-api-access-wffzr\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.945890 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.955837 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hzs78" event={"ID":"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d","Type":"ContainerStarted","Data":"13fb30f0e98afd5102c75b063d5e9dedad3a88eb8a1cb4f9dfb35321bd545830"} Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 07:05:52.961966 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" event={"ID":"8c765369-da46-4bff-96f0-279d6c0b3f2c","Type":"ContainerStarted","Data":"aa35409474a6e9fe7bdbc1a97dbe79db571b94bfde41cde6ede6d634b1fd7e1f"} Feb 23 07:05:52 crc kubenswrapper[5047]: I0223 
07:05:52.991552 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hzs78" podStartSLOduration=2.991526457 podStartE2EDuration="2.991526457s" podCreationTimestamp="2026-02-23 07:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:52.989848043 +0000 UTC m=+1275.241175177" watchObservedRunningTime="2026-02-23 07:05:52.991526457 +0000 UTC m=+1275.242853591" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.315844 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.437612 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-ovsdbserver-nb\") pod \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.437771 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dwd7\" (UniqueName: \"kubernetes.io/projected/c7651c70-c537-4aa9-88ca-daf3ac0076d4-kube-api-access-4dwd7\") pod \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.437833 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-dns-svc\") pod \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.437994 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-config\") pod \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\" (UID: \"c7651c70-c537-4aa9-88ca-daf3ac0076d4\") " Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.438405 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0" Feb 23 07:05:53 crc kubenswrapper[5047]: E0223 07:05:53.439977 5047 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 07:05:53 crc kubenswrapper[5047]: E0223 07:05:53.440008 5047 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 07:05:53 crc kubenswrapper[5047]: E0223 07:05:53.440067 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift podName:62c288fd-a798-4337-882a-ab4ebb8331cb nodeName:}" failed. No retries permitted until 2026-02-23 07:05:54.440045413 +0000 UTC m=+1276.691372547 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift") pod "swift-storage-0" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb") : configmap "swift-ring-files" not found Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.453874 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7651c70-c537-4aa9-88ca-daf3ac0076d4-kube-api-access-4dwd7" (OuterVolumeSpecName: "kube-api-access-4dwd7") pod "c7651c70-c537-4aa9-88ca-daf3ac0076d4" (UID: "c7651c70-c537-4aa9-88ca-daf3ac0076d4"). InnerVolumeSpecName "kube-api-access-4dwd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.484561 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7651c70-c537-4aa9-88ca-daf3ac0076d4" (UID: "c7651c70-c537-4aa9-88ca-daf3ac0076d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.485675 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-config" (OuterVolumeSpecName: "config") pod "c7651c70-c537-4aa9-88ca-daf3ac0076d4" (UID: "c7651c70-c537-4aa9-88ca-daf3ac0076d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.485797 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7651c70-c537-4aa9-88ca-daf3ac0076d4" (UID: "c7651c70-c537-4aa9-88ca-daf3ac0076d4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.540486 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.540536 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dwd7\" (UniqueName: \"kubernetes.io/projected/c7651c70-c537-4aa9-88ca-daf3ac0076d4-kube-api-access-4dwd7\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.540549 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.540559 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7651c70-c537-4aa9-88ca-daf3ac0076d4-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.970605 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm"
Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.970605 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-6w2hm" event={"ID":"c7651c70-c537-4aa9-88ca-daf3ac0076d4","Type":"ContainerDied","Data":"62df6943de4619fd57c74c16322f398c7bd40b4c9d701a4dfa43d8098126c568"}
Feb 23 07:05:53 crc kubenswrapper[5047]: I0223 07:05:53.970690 5047 scope.go:117] "RemoveContainer" containerID="00a7f088734a75d96b7656ddad33afada199fb34a41d6c188a90b01ff17d902b"
Feb 23 07:05:54 crc kubenswrapper[5047]: I0223 07:05:54.076103 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-6w2hm"]
Feb 23 07:05:54 crc kubenswrapper[5047]: I0223 07:05:54.082284 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-6w2hm"]
Feb 23 07:05:54 crc kubenswrapper[5047]: I0223 07:05:54.353210 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7651c70-c537-4aa9-88ca-daf3ac0076d4" path="/var/lib/kubelet/pods/c7651c70-c537-4aa9-88ca-daf3ac0076d4/volumes"
Feb 23 07:05:54 crc kubenswrapper[5047]: I0223 07:05:54.477397 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0"
Feb 23 07:05:54 crc kubenswrapper[5047]: E0223 07:05:54.479318 5047 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:05:54 crc kubenswrapper[5047]: E0223 07:05:54.479346 5047 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:05:54 crc kubenswrapper[5047]: E0223 07:05:54.479398 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift podName:62c288fd-a798-4337-882a-ab4ebb8331cb nodeName:}" failed. No retries permitted until 2026-02-23 07:05:56.479375937 +0000 UTC m=+1278.730703071 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift") pod "swift-storage-0" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb") : configmap "swift-ring-files" not found
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.540964 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0"
Feb 23 07:05:56 crc kubenswrapper[5047]: E0223 07:05:56.541402 5047 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:05:56 crc kubenswrapper[5047]: E0223 07:05:56.541534 5047 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:05:56 crc kubenswrapper[5047]: E0223 07:05:56.541598 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift podName:62c288fd-a798-4337-882a-ab4ebb8331cb nodeName:}" failed. No retries permitted until 2026-02-23 07:06:00.541577329 +0000 UTC m=+1282.792904463 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift") pod "swift-storage-0" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb") : configmap "swift-ring-files" not found
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.567306 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kbprb"]
Feb 23 07:05:56 crc kubenswrapper[5047]: E0223 07:05:56.569779 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7651c70-c537-4aa9-88ca-daf3ac0076d4" containerName="init"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.569861 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7651c70-c537-4aa9-88ca-daf3ac0076d4" containerName="init"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.570932 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7651c70-c537-4aa9-88ca-daf3ac0076d4" containerName="init"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.572725 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.578933 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.579233 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.579537 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.601974 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kbprb"]
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.643843 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-dispersionconf\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.643954 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfs2p\" (UniqueName: \"kubernetes.io/projected/156f1174-6481-45c2-8a34-bc744828e345-kube-api-access-cfs2p\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.644007 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-ring-data-devices\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.644034 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-swiftconf\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.644231 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-combined-ca-bundle\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.644277 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-scripts\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.644336 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/156f1174-6481-45c2-8a34-bc744828e345-etc-swift\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.747108 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-dispersionconf\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.747185 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfs2p\" (UniqueName: \"kubernetes.io/projected/156f1174-6481-45c2-8a34-bc744828e345-kube-api-access-cfs2p\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.747219 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-ring-data-devices\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.747257 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-swiftconf\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.747382 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-combined-ca-bundle\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.747418 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-scripts\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.747483 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/156f1174-6481-45c2-8a34-bc744828e345-etc-swift\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.749059 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/156f1174-6481-45c2-8a34-bc744828e345-etc-swift\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.749304 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-ring-data-devices\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.749457 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-scripts\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.757247 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-combined-ca-bundle\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.762736 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-swiftconf\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.765354 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-dispersionconf\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.770418 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfs2p\" (UniqueName: \"kubernetes.io/projected/156f1174-6481-45c2-8a34-bc744828e345-kube-api-access-cfs2p\") pod \"swift-ring-rebalance-kbprb\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:56 crc kubenswrapper[5047]: I0223 07:05:56.909635 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kbprb"
Feb 23 07:05:57 crc kubenswrapper[5047]: I0223 07:05:57.442326 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kbprb"]
Feb 23 07:05:57 crc kubenswrapper[5047]: I0223 07:05:57.623696 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 23 07:05:57 crc kubenswrapper[5047]: I0223 07:05:57.623770 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.035820 5047 generic.go:334] "Generic (PLEG): container finished" podID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerID="0737c350066d281c56cf052bf14fe4e41d910589f64b82514b9bb7568b6dfc8b" exitCode=0
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.036013 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" event={"ID":"8c765369-da46-4bff-96f0-279d6c0b3f2c","Type":"ContainerDied","Data":"0737c350066d281c56cf052bf14fe4e41d910589f64b82514b9bb7568b6dfc8b"}
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.038745 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kbprb" event={"ID":"156f1174-6481-45c2-8a34-bc744828e345","Type":"ContainerStarted","Data":"b6139725321d6788c01c1eab200e4edb70a2d0e0b01e45905300d6bbfd1f6e40"}
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.043686 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" event={"ID":"d7c0b33b-271e-4121-b14e-892fbed8edd8","Type":"ContainerStarted","Data":"f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7"}
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.044136 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt"
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.082168 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" podStartSLOduration=8.082145181 podStartE2EDuration="8.082145181s" podCreationTimestamp="2026-02-23 07:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:58.08096477 +0000 UTC m=+1280.332291904" watchObservedRunningTime="2026-02-23 07:05:58.082145181 +0000 UTC m=+1280.333472315"
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.988555 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 23 07:05:58 crc kubenswrapper[5047]: I0223 07:05:58.988885 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 23 07:05:59 crc kubenswrapper[5047]: I0223 07:05:59.057198 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" event={"ID":"8c765369-da46-4bff-96f0-279d6c0b3f2c","Type":"ContainerStarted","Data":"828610b690bbe81f9a30874069a355a617bd2aee568a38e0062ad76bb08b84ce"}
Feb 23 07:05:59 crc kubenswrapper[5047]: I0223 07:05:59.057679 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv"
Feb 23 07:05:59 crc kubenswrapper[5047]: I0223 07:05:59.060348 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9ebb281-4310-483e-b599-3d3c8775e341","Type":"ContainerStarted","Data":"1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0"}
Feb 23 07:05:59 crc kubenswrapper[5047]: I0223 07:05:59.060402 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9ebb281-4310-483e-b599-3d3c8775e341","Type":"ContainerStarted","Data":"5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22"}
Feb 23 07:05:59 crc kubenswrapper[5047]: I0223 07:05:59.060732 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 23 07:05:59 crc kubenswrapper[5047]: I0223 07:05:59.087806 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" podStartSLOduration=8.087734679 podStartE2EDuration="8.087734679s" podCreationTimestamp="2026-02-23 07:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:59.077948729 +0000 UTC m=+1281.329275913" watchObservedRunningTime="2026-02-23 07:05:59.087734679 +0000 UTC m=+1281.339061823"
Feb 23 07:05:59 crc kubenswrapper[5047]: I0223 07:05:59.110202 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.808802139 podStartE2EDuration="9.110175305s" podCreationTimestamp="2026-02-23 07:05:50 +0000 UTC" firstStartedPulling="2026-02-23 07:05:51.573698289 +0000 UTC m=+1273.825025423" lastFinishedPulling="2026-02-23 07:05:57.875071455 +0000 UTC m=+1280.126398589" observedRunningTime="2026-02-23 07:05:59.106548069 +0000 UTC m=+1281.357875223" watchObservedRunningTime="2026-02-23 07:05:59.110175305 +0000 UTC m=+1281.361502439"
Feb 23 07:06:00 crc kubenswrapper[5047]: I0223 07:06:00.627465 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0"
Feb 23 07:06:00 crc kubenswrapper[5047]: E0223 07:06:00.627707 5047 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:06:00 crc kubenswrapper[5047]: E0223 07:06:00.627728 5047 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:06:00 crc kubenswrapper[5047]: E0223 07:06:00.627790 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift podName:62c288fd-a798-4337-882a-ab4ebb8331cb nodeName:}" failed. No retries permitted until 2026-02-23 07:06:08.627768627 +0000 UTC m=+1290.879095781 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift") pod "swift-storage-0" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb") : configmap "swift-ring-files" not found
Feb 23 07:06:01 crc kubenswrapper[5047]: I0223 07:06:01.079762 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kbprb" event={"ID":"156f1174-6481-45c2-8a34-bc744828e345","Type":"ContainerStarted","Data":"9019902801f87614a70096198dfd3e7b6168c72d3eeaac06fc70c4c41623d9e7"}
Feb 23 07:06:01 crc kubenswrapper[5047]: I0223 07:06:01.116762 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kbprb" podStartSLOduration=1.86843608 podStartE2EDuration="5.116724898s" podCreationTimestamp="2026-02-23 07:05:56 +0000 UTC" firstStartedPulling="2026-02-23 07:05:57.444650281 +0000 UTC m=+1279.695977455" lastFinishedPulling="2026-02-23 07:06:00.692939129 +0000 UTC m=+1282.944266273" observedRunningTime="2026-02-23 07:06:01.109879415 +0000 UTC m=+1283.361206579" watchObservedRunningTime="2026-02-23 07:06:01.116724898 +0000 UTC m=+1283.368052072"
Feb 23 07:06:01 crc kubenswrapper[5047]: I0223 07:06:01.338781 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 23 07:06:01 crc kubenswrapper[5047]: I0223 07:06:01.459575 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 23 07:06:01 crc kubenswrapper[5047]: I0223 07:06:01.720585 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 23 07:06:01 crc kubenswrapper[5047]: I0223 07:06:01.840514 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 23 07:06:05 crc kubenswrapper[5047]: I0223 07:06:05.732198 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.461844 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9ktkb"]
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.463347 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.466448 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.475886 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9ktkb"]
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.573697 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e093bba7-db7f-4f7c-b498-ab4a4d06c661-operator-scripts\") pod \"root-account-create-update-9ktkb\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.573975 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4gw5\" (UniqueName: \"kubernetes.io/projected/e093bba7-db7f-4f7c-b498-ab4a4d06c661-kube-api-access-r4gw5\") pod \"root-account-create-update-9ktkb\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.675332 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4gw5\" (UniqueName: \"kubernetes.io/projected/e093bba7-db7f-4f7c-b498-ab4a4d06c661-kube-api-access-r4gw5\") pod \"root-account-create-update-9ktkb\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.675423 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e093bba7-db7f-4f7c-b498-ab4a4d06c661-operator-scripts\") pod \"root-account-create-update-9ktkb\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.676938 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e093bba7-db7f-4f7c-b498-ab4a4d06c661-operator-scripts\") pod \"root-account-create-update-9ktkb\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.726346 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4gw5\" (UniqueName: \"kubernetes.io/projected/e093bba7-db7f-4f7c-b498-ab4a4d06c661-kube-api-access-r4gw5\") pod \"root-account-create-update-9ktkb\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.811217 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9ktkb"
Feb 23 07:06:06 crc kubenswrapper[5047]: I0223 07:06:06.996156 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv"
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.065969 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ds2lt"]
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.066286 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerName="dnsmasq-dns" containerID="cri-o://f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7" gracePeriod=10
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.373696 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9ktkb"]
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.484274 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt"
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.600521 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-nb\") pod \"d7c0b33b-271e-4121-b14e-892fbed8edd8\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") "
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.600638 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-config\") pod \"d7c0b33b-271e-4121-b14e-892fbed8edd8\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") "
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.600680 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-dns-svc\") pod \"d7c0b33b-271e-4121-b14e-892fbed8edd8\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") "
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.600830 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-sb\") pod \"d7c0b33b-271e-4121-b14e-892fbed8edd8\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") "
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.600897 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n59jp\" (UniqueName: \"kubernetes.io/projected/d7c0b33b-271e-4121-b14e-892fbed8edd8-kube-api-access-n59jp\") pod \"d7c0b33b-271e-4121-b14e-892fbed8edd8\" (UID: \"d7c0b33b-271e-4121-b14e-892fbed8edd8\") "
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.608131 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c0b33b-271e-4121-b14e-892fbed8edd8-kube-api-access-n59jp" (OuterVolumeSpecName: "kube-api-access-n59jp") pod "d7c0b33b-271e-4121-b14e-892fbed8edd8" (UID: "d7c0b33b-271e-4121-b14e-892fbed8edd8"). InnerVolumeSpecName "kube-api-access-n59jp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.650333 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7c0b33b-271e-4121-b14e-892fbed8edd8" (UID: "d7c0b33b-271e-4121-b14e-892fbed8edd8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.665844 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-config" (OuterVolumeSpecName: "config") pod "d7c0b33b-271e-4121-b14e-892fbed8edd8" (UID: "d7c0b33b-271e-4121-b14e-892fbed8edd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.674822 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7c0b33b-271e-4121-b14e-892fbed8edd8" (UID: "d7c0b33b-271e-4121-b14e-892fbed8edd8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.676099 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7c0b33b-271e-4121-b14e-892fbed8edd8" (UID: "d7c0b33b-271e-4121-b14e-892fbed8edd8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.704115 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.704176 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.704196 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.704212 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n59jp\" (UniqueName: \"kubernetes.io/projected/d7c0b33b-271e-4121-b14e-892fbed8edd8-kube-api-access-n59jp\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:07 crc kubenswrapper[5047]: I0223 07:06:07.704226 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c0b33b-271e-4121-b14e-892fbed8edd8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.156732 5047 generic.go:334] "Generic (PLEG): container finished" podID="e093bba7-db7f-4f7c-b498-ab4a4d06c661" containerID="7f3ef48ce4065d19a564ca8ebcc229080a58904b941c27c470fd874c0374ed7f" exitCode=0
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.157196 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9ktkb" event={"ID":"e093bba7-db7f-4f7c-b498-ab4a4d06c661","Type":"ContainerDied","Data":"7f3ef48ce4065d19a564ca8ebcc229080a58904b941c27c470fd874c0374ed7f"}
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.157462 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9ktkb" event={"ID":"e093bba7-db7f-4f7c-b498-ab4a4d06c661","Type":"ContainerStarted","Data":"74ad97cee02cf02e9798b2aa1d9396dc6c8bc3475bbca857a52f740913a0ec01"}
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.160751 5047 generic.go:334] "Generic (PLEG): container finished" podID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerID="f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7" exitCode=0
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.160870 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.160953 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" event={"ID":"d7c0b33b-271e-4121-b14e-892fbed8edd8","Type":"ContainerDied","Data":"f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7"}
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.161023 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ds2lt" event={"ID":"d7c0b33b-271e-4121-b14e-892fbed8edd8","Type":"ContainerDied","Data":"012f05ed58492e69a79053ef74dd67af8603302c09253bfa2047122e33411c3f"}
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.161061 5047 scope.go:117] "RemoveContainer" containerID="f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.164210 5047 generic.go:334] "Generic (PLEG): container finished" podID="156f1174-6481-45c2-8a34-bc744828e345" containerID="9019902801f87614a70096198dfd3e7b6168c72d3eeaac06fc70c4c41623d9e7" exitCode=0
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.164276 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kbprb" event={"ID":"156f1174-6481-45c2-8a34-bc744828e345","Type":"ContainerDied","Data":"9019902801f87614a70096198dfd3e7b6168c72d3eeaac06fc70c4c41623d9e7"}
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.197465 5047 scope.go:117] "RemoveContainer" containerID="b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.249330 5047 scope.go:117] "RemoveContainer" containerID="f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7"
Feb 23 07:06:08 crc kubenswrapper[5047]: E0223 07:06:08.251984 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7\": container with ID starting with f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7 not found: ID does not exist" containerID="f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.252106 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7"} err="failed to get container status \"f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7\": rpc error: code = NotFound desc = could not find container \"f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7\": container with ID starting with f0458b12101f7d74254c12fcc50eb022287bb9bd6626f32c2ebdb3c6979dadf7 not found: ID does not exist"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.252164 5047 scope.go:117] "RemoveContainer" containerID="b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04"
Feb 23 07:06:08 crc kubenswrapper[5047]: E0223 07:06:08.252767 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04\": container with ID starting with b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04 not found: ID does not exist" containerID="b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.252894 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04"} err="failed to get container status \"b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04\": rpc error: code = NotFound desc = could not find container \"b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04\": container with ID starting with b9f7aceebda11a124f924bc8957fcdb2e8818ee87d9db91cd8812a2cc0bf5b04 not found: ID does not exist"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.267981 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ds2lt"]
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.281693 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ds2lt"]
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.361813 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" path="/var/lib/kubelet/pods/d7c0b33b-271e-4121-b14e-892fbed8edd8/volumes"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.639126 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.654762 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"swift-storage-0\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " pod="openstack/swift-storage-0"
Feb 23 07:06:08 crc kubenswrapper[5047]: I0223 07:06:08.938500 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.186526 5047 generic.go:334] "Generic (PLEG): container finished" podID="6776acf2-e53f-4892-847d-8667669a5eb9" containerID="1550a785ef91ff8b4c17d9b0e6b0acd706caa1828e577823dabee876eb27c6e4" exitCode=0
Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.187141 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6776acf2-e53f-4892-847d-8667669a5eb9","Type":"ContainerDied","Data":"1550a785ef91ff8b4c17d9b0e6b0acd706caa1828e577823dabee876eb27c6e4"}
Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.196473 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e83857b-5e17-4878-8f9b-e8d1a65325ba","Type":"ContainerDied","Data":"57718f64876e2968f7e2d3c286ddd88b4f77442fd1898477e29b9138ee17816a"}
Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.196629 5047 generic.go:334] "Generic (PLEG): container finished" podID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerID="57718f64876e2968f7e2d3c286ddd88b4f77442fd1898477e29b9138ee17816a" exitCode=0
Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.573796 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.586629 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dbb59"]
Feb 23 07:06:09 crc kubenswrapper[5047]: E0223 07:06:09.587113 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerName="dnsmasq-dns"
Feb 23 07:06:09 crc 
kubenswrapper[5047]: I0223 07:06:09.587134 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerName="dnsmasq-dns" Feb 23 07:06:09 crc kubenswrapper[5047]: E0223 07:06:09.587172 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerName="init" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.587180 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerName="init" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.587343 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c0b33b-271e-4121-b14e-892fbed8edd8" containerName="dnsmasq-dns" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.588064 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.609411 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dbb59"] Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.618012 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-86db-account-create-update-5dpfx"] Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.619838 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.622668 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.653763 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-86db-account-create-update-5dpfx"] Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.672370 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kbprb" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.672582 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2629401-ad79-4aab-b3ec-665c69eeaf78-operator-scripts\") pod \"glance-db-create-dbb59\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.672629 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n84qj\" (UniqueName: \"kubernetes.io/projected/c2629401-ad79-4aab-b3ec-665c69eeaf78-kube-api-access-n84qj\") pod \"glance-db-create-dbb59\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.701534 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9ktkb" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774179 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfs2p\" (UniqueName: \"kubernetes.io/projected/156f1174-6481-45c2-8a34-bc744828e345-kube-api-access-cfs2p\") pod \"156f1174-6481-45c2-8a34-bc744828e345\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774231 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/156f1174-6481-45c2-8a34-bc744828e345-etc-swift\") pod \"156f1174-6481-45c2-8a34-bc744828e345\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774269 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-swiftconf\") pod \"156f1174-6481-45c2-8a34-bc744828e345\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774397 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e093bba7-db7f-4f7c-b498-ab4a4d06c661-operator-scripts\") pod \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774419 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-combined-ca-bundle\") pod \"156f1174-6481-45c2-8a34-bc744828e345\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774440 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-ring-data-devices\") pod \"156f1174-6481-45c2-8a34-bc744828e345\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774480 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-dispersionconf\") pod \"156f1174-6481-45c2-8a34-bc744828e345\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774504 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4gw5\" (UniqueName: \"kubernetes.io/projected/e093bba7-db7f-4f7c-b498-ab4a4d06c661-kube-api-access-r4gw5\") pod \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\" (UID: \"e093bba7-db7f-4f7c-b498-ab4a4d06c661\") " Feb 23 07:06:09 crc 
kubenswrapper[5047]: I0223 07:06:09.774572 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-scripts\") pod \"156f1174-6481-45c2-8a34-bc744828e345\" (UID: \"156f1174-6481-45c2-8a34-bc744828e345\") " Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774930 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5k47\" (UniqueName: \"kubernetes.io/projected/b8693532-d88f-49d6-a72c-4d751df3a3eb-kube-api-access-t5k47\") pod \"glance-86db-account-create-update-5dpfx\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.774998 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8693532-d88f-49d6-a72c-4d751df3a3eb-operator-scripts\") pod \"glance-86db-account-create-update-5dpfx\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.775104 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2629401-ad79-4aab-b3ec-665c69eeaf78-operator-scripts\") pod \"glance-db-create-dbb59\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.775137 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n84qj\" (UniqueName: \"kubernetes.io/projected/c2629401-ad79-4aab-b3ec-665c69eeaf78-kube-api-access-n84qj\") pod \"glance-db-create-dbb59\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc 
kubenswrapper[5047]: I0223 07:06:09.775469 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156f1174-6481-45c2-8a34-bc744828e345-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "156f1174-6481-45c2-8a34-bc744828e345" (UID: "156f1174-6481-45c2-8a34-bc744828e345"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.776887 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2629401-ad79-4aab-b3ec-665c69eeaf78-operator-scripts\") pod \"glance-db-create-dbb59\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.776886 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "156f1174-6481-45c2-8a34-bc744828e345" (UID: "156f1174-6481-45c2-8a34-bc744828e345"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.777486 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e093bba7-db7f-4f7c-b498-ab4a4d06c661-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e093bba7-db7f-4f7c-b498-ab4a4d06c661" (UID: "e093bba7-db7f-4f7c-b498-ab4a4d06c661"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.779749 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e093bba7-db7f-4f7c-b498-ab4a4d06c661-kube-api-access-r4gw5" (OuterVolumeSpecName: "kube-api-access-r4gw5") pod "e093bba7-db7f-4f7c-b498-ab4a4d06c661" (UID: "e093bba7-db7f-4f7c-b498-ab4a4d06c661"). InnerVolumeSpecName "kube-api-access-r4gw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.780383 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156f1174-6481-45c2-8a34-bc744828e345-kube-api-access-cfs2p" (OuterVolumeSpecName: "kube-api-access-cfs2p") pod "156f1174-6481-45c2-8a34-bc744828e345" (UID: "156f1174-6481-45c2-8a34-bc744828e345"). InnerVolumeSpecName "kube-api-access-cfs2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.782925 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "156f1174-6481-45c2-8a34-bc744828e345" (UID: "156f1174-6481-45c2-8a34-bc744828e345"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.792089 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n84qj\" (UniqueName: \"kubernetes.io/projected/c2629401-ad79-4aab-b3ec-665c69eeaf78-kube-api-access-n84qj\") pod \"glance-db-create-dbb59\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.796007 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-scripts" (OuterVolumeSpecName: "scripts") pod "156f1174-6481-45c2-8a34-bc744828e345" (UID: "156f1174-6481-45c2-8a34-bc744828e345"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.803996 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "156f1174-6481-45c2-8a34-bc744828e345" (UID: "156f1174-6481-45c2-8a34-bc744828e345"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.808340 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "156f1174-6481-45c2-8a34-bc744828e345" (UID: "156f1174-6481-45c2-8a34-bc744828e345"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.876962 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5k47\" (UniqueName: \"kubernetes.io/projected/b8693532-d88f-49d6-a72c-4d751df3a3eb-kube-api-access-t5k47\") pod \"glance-86db-account-create-update-5dpfx\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877063 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8693532-d88f-49d6-a72c-4d751df3a3eb-operator-scripts\") pod \"glance-86db-account-create-update-5dpfx\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877221 5047 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877332 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4gw5\" (UniqueName: \"kubernetes.io/projected/e093bba7-db7f-4f7c-b498-ab4a4d06c661-kube-api-access-r4gw5\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877357 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877371 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfs2p\" (UniqueName: \"kubernetes.io/projected/156f1174-6481-45c2-8a34-bc744828e345-kube-api-access-cfs2p\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc 
kubenswrapper[5047]: I0223 07:06:09.877384 5047 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/156f1174-6481-45c2-8a34-bc744828e345-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877395 5047 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877410 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e093bba7-db7f-4f7c-b498-ab4a4d06c661-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877418 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156f1174-6481-45c2-8a34-bc744828e345-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.877435 5047 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/156f1174-6481-45c2-8a34-bc744828e345-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.878111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8693532-d88f-49d6-a72c-4d751df3a3eb-operator-scripts\") pod \"glance-86db-account-create-update-5dpfx\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.892069 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5k47\" (UniqueName: \"kubernetes.io/projected/b8693532-d88f-49d6-a72c-4d751df3a3eb-kube-api-access-t5k47\") pod 
\"glance-86db-account-create-update-5dpfx\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.956172 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dbb59" Feb 23 07:06:09 crc kubenswrapper[5047]: I0223 07:06:09.978443 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.173525 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-cglrk"] Feb 23 07:06:10 crc kubenswrapper[5047]: E0223 07:06:10.174332 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e093bba7-db7f-4f7c-b498-ab4a4d06c661" containerName="mariadb-account-create-update" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.174345 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e093bba7-db7f-4f7c-b498-ab4a4d06c661" containerName="mariadb-account-create-update" Feb 23 07:06:10 crc kubenswrapper[5047]: E0223 07:06:10.174378 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156f1174-6481-45c2-8a34-bc744828e345" containerName="swift-ring-rebalance" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.174385 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="156f1174-6481-45c2-8a34-bc744828e345" containerName="swift-ring-rebalance" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.174550 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e093bba7-db7f-4f7c-b498-ab4a4d06c661" containerName="mariadb-account-create-update" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.174571 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="156f1174-6481-45c2-8a34-bc744828e345" containerName="swift-ring-rebalance" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.175195 
5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.190870 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cglrk"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.207526 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kbprb" event={"ID":"156f1174-6481-45c2-8a34-bc744828e345","Type":"ContainerDied","Data":"b6139725321d6788c01c1eab200e4edb70a2d0e0b01e45905300d6bbfd1f6e40"} Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.207577 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6139725321d6788c01c1eab200e4edb70a2d0e0b01e45905300d6bbfd1f6e40" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.207648 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kbprb" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.212506 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9ktkb" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.212517 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9ktkb" event={"ID":"e093bba7-db7f-4f7c-b498-ab4a4d06c661","Type":"ContainerDied","Data":"74ad97cee02cf02e9798b2aa1d9396dc6c8bc3475bbca857a52f740913a0ec01"} Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.212598 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ad97cee02cf02e9798b2aa1d9396dc6c8bc3475bbca857a52f740913a0ec01" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.215478 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"69eae6d331ca4b69ebdf013793798bcaf88077b4d26b91c8edac4565e0eb41e4"} Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.217351 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6776acf2-e53f-4892-847d-8667669a5eb9","Type":"ContainerStarted","Data":"4ace61959f8493ad41766db44b20ac96b4147030ebc66648117da372593fea6c"} Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.217555 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.227800 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e83857b-5e17-4878-8f9b-e8d1a65325ba","Type":"ContainerStarted","Data":"2ac5bc2b66d5e8526a58472b2784d5b39320ba72eae19f8bc017391874b9616a"} Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.228110 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.247472 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-server-0" podStartSLOduration=41.913027127 podStartE2EDuration="56.247452124s" podCreationTimestamp="2026-02-23 07:05:14 +0000 UTC" firstStartedPulling="2026-02-23 07:05:20.581024447 +0000 UTC m=+1242.832351581" lastFinishedPulling="2026-02-23 07:05:34.915449444 +0000 UTC m=+1257.166776578" observedRunningTime="2026-02-23 07:06:10.244578398 +0000 UTC m=+1292.495905522" watchObservedRunningTime="2026-02-23 07:06:10.247452124 +0000 UTC m=+1292.498779258" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.288777 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxqxj\" (UniqueName: \"kubernetes.io/projected/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-kube-api-access-pxqxj\") pod \"keystone-db-create-cglrk\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.288939 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-operator-scripts\") pod \"keystone-db-create-cglrk\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.362673 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8b47-account-create-update-btdp7"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.364176 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.368794 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8b47-account-create-update-btdp7"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.369416 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.384847 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.384809326 podStartE2EDuration="56.384809326s" podCreationTimestamp="2026-02-23 07:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:10.301511801 +0000 UTC m=+1292.552838935" watchObservedRunningTime="2026-02-23 07:06:10.384809326 +0000 UTC m=+1292.636136460" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.390838 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxqxj\" (UniqueName: \"kubernetes.io/projected/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-kube-api-access-pxqxj\") pod \"keystone-db-create-cglrk\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.391301 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-operator-scripts\") pod \"keystone-db-create-cglrk\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.395667 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-operator-scripts\") pod \"keystone-db-create-cglrk\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.416387 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxqxj\" (UniqueName: \"kubernetes.io/projected/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-kube-api-access-pxqxj\") pod \"keystone-db-create-cglrk\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.432813 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-s9nk6"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.433997 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.447807 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s9nk6"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.472631 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dbb59"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.492968 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-operator-scripts\") pod \"keystone-8b47-account-create-update-btdp7\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.493062 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2qp\" (UniqueName: \"kubernetes.io/projected/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-kube-api-access-bk2qp\") pod 
\"placement-db-create-s9nk6\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.493260 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-operator-scripts\") pod \"placement-db-create-s9nk6\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.493552 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhq6\" (UniqueName: \"kubernetes.io/projected/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-kube-api-access-rqhq6\") pod \"keystone-8b47-account-create-update-btdp7\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: W0223 07:06:10.495131 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2629401_ad79_4aab_b3ec_665c69eeaf78.slice/crio-f6660fee5c143c2335881bfa39dda7c0c95b037f17778ec86619d9e0e3882d21 WatchSource:0}: Error finding container f6660fee5c143c2335881bfa39dda7c0c95b037f17778ec86619d9e0e3882d21: Status 404 returned error can't find the container with id f6660fee5c143c2335881bfa39dda7c0c95b037f17778ec86619d9e0e3882d21 Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.501868 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.538541 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b51d-account-create-update-9qmbt"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.539696 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.544064 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.551624 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b51d-account-create-update-9qmbt"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.586278 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-86db-account-create-update-5dpfx"] Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.595861 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-operator-scripts\") pod \"placement-db-create-s9nk6\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.595970 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-operator-scripts\") pod \"placement-b51d-account-create-update-9qmbt\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.596067 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhq6\" (UniqueName: \"kubernetes.io/projected/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-kube-api-access-rqhq6\") pod \"keystone-8b47-account-create-update-btdp7\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.596106 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-operator-scripts\") pod \"keystone-8b47-account-create-update-btdp7\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.596453 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2qp\" (UniqueName: \"kubernetes.io/projected/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-kube-api-access-bk2qp\") pod \"placement-db-create-s9nk6\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.596585 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jmmz\" (UniqueName: \"kubernetes.io/projected/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-kube-api-access-6jmmz\") pod \"placement-b51d-account-create-update-9qmbt\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.597010 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-operator-scripts\") pod \"keystone-8b47-account-create-update-btdp7\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.597064 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-operator-scripts\") pod \"placement-db-create-s9nk6\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.617132 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhq6\" (UniqueName: \"kubernetes.io/projected/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-kube-api-access-rqhq6\") pod \"keystone-8b47-account-create-update-btdp7\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.627826 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2qp\" (UniqueName: \"kubernetes.io/projected/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-kube-api-access-bk2qp\") pod \"placement-db-create-s9nk6\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.693298 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.698190 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jmmz\" (UniqueName: \"kubernetes.io/projected/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-kube-api-access-6jmmz\") pod \"placement-b51d-account-create-update-9qmbt\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.698261 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-operator-scripts\") pod \"placement-b51d-account-create-update-9qmbt\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.699097 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-operator-scripts\") pod \"placement-b51d-account-create-update-9qmbt\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.716834 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jmmz\" (UniqueName: \"kubernetes.io/projected/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-kube-api-access-6jmmz\") pod \"placement-b51d-account-create-update-9qmbt\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.759786 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.833748 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 23 07:06:10 crc kubenswrapper[5047]: W0223 07:06:10.849991 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8693532_d88f_49d6_a72c_4d751df3a3eb.slice/crio-cd1749260948cbb442575177f18b68e30c60722a1a5130ea840bab08205dbce6 WatchSource:0}: Error finding container cd1749260948cbb442575177f18b68e30c60722a1a5130ea840bab08205dbce6: Status 404 returned error can't find the container with id cd1749260948cbb442575177f18b68e30c60722a1a5130ea840bab08205dbce6 Feb 23 07:06:10 crc kubenswrapper[5047]: I0223 07:06:10.865146 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.237926 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbb59" event={"ID":"c2629401-ad79-4aab-b3ec-665c69eeaf78","Type":"ContainerStarted","Data":"752c7e2e7f7fbc780ee3298c5345b10c649bdfe5f98e119767fa42545912892e"} Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.239396 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbb59" event={"ID":"c2629401-ad79-4aab-b3ec-665c69eeaf78","Type":"ContainerStarted","Data":"f6660fee5c143c2335881bfa39dda7c0c95b037f17778ec86619d9e0e3882d21"} Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.240529 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-86db-account-create-update-5dpfx" event={"ID":"b8693532-d88f-49d6-a72c-4d751df3a3eb","Type":"ContainerStarted","Data":"cd1749260948cbb442575177f18b68e30c60722a1a5130ea840bab08205dbce6"} Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.254151 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-dbb59" podStartSLOduration=2.25413323 podStartE2EDuration="2.25413323s" podCreationTimestamp="2026-02-23 07:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:11.252661802 +0000 UTC m=+1293.503988936" watchObservedRunningTime="2026-02-23 07:06:11.25413323 +0000 UTC m=+1293.505460364" Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.523077 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8b47-account-create-update-btdp7"] Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.534161 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cglrk"] Feb 23 07:06:11 crc kubenswrapper[5047]: W0223 07:06:11.544155 5047 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a23e58e_d907_4f3e_92cb_be6652ceb9d6.slice/crio-3a5a85339eec5608544c4de1483ddd2cc5bcf1c5afece4f3af2083a8d213338c WatchSource:0}: Error finding container 3a5a85339eec5608544c4de1483ddd2cc5bcf1c5afece4f3af2083a8d213338c: Status 404 returned error can't find the container with id 3a5a85339eec5608544c4de1483ddd2cc5bcf1c5afece4f3af2083a8d213338c Feb 23 07:06:11 crc kubenswrapper[5047]: W0223 07:06:11.544677 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb78afe08_d5e4_45d8_9149_4c0a1020a4cf.slice/crio-2456574d0db0418095cd599763057bf43f0ff0a6de7e1b0a0051ee0d908c2189 WatchSource:0}: Error finding container 2456574d0db0418095cd599763057bf43f0ff0a6de7e1b0a0051ee0d908c2189: Status 404 returned error can't find the container with id 2456574d0db0418095cd599763057bf43f0ff0a6de7e1b0a0051ee0d908c2189 Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.634836 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s9nk6"] Feb 23 07:06:11 crc kubenswrapper[5047]: I0223 07:06:11.646134 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b51d-account-create-update-9qmbt"] Feb 23 07:06:11 crc kubenswrapper[5047]: W0223 07:06:11.678839 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dc3bfa6_76a6_45fb_bf1f_7c12b0959ce5.slice/crio-d1576d2707147dc8aaf51a6888db0ac9df0bf5f4a09fdee99021837e745a3cee WatchSource:0}: Error finding container d1576d2707147dc8aaf51a6888db0ac9df0bf5f4a09fdee99021837e745a3cee: Status 404 returned error can't find the container with id d1576d2707147dc8aaf51a6888db0ac9df0bf5f4a09fdee99021837e745a3cee Feb 23 07:06:11 crc kubenswrapper[5047]: W0223 07:06:11.680077 5047 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda90c97f0_3c5e_4b65_aee9_bd9c2dd5fbb9.slice/crio-564613d17c8c5f3477c33b163fc55f7b1389159732f369284061712229a11b14 WatchSource:0}: Error finding container 564613d17c8c5f3477c33b163fc55f7b1389159732f369284061712229a11b14: Status 404 returned error can't find the container with id 564613d17c8c5f3477c33b163fc55f7b1389159732f369284061712229a11b14 Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.251728 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"17ebe3519431a9953fc33fed5f2c323c9c8f0bbaad32401ddf2b831449414025"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.251790 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"a19fd42fd14901f6743c36bb99a68aa21bd85e41d93af3960094cb69a6484383"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.251804 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"ef7d56ab512ba7ddcef234fef06dfaa7575dadd48c522e8ea9b685da5f62a286"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.251815 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"07872e17cdb6ef25b82b90ca7744f33fec2b7c83d31131b37d885019a804232f"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.253626 5047 generic.go:334] "Generic (PLEG): container finished" podID="a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9" containerID="f8b0272cd2fe35be240a50c0e711eb436bc05335fe0527ea4e69dae436cba0f8" exitCode=0 Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.253699 5047 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-b51d-account-create-update-9qmbt" event={"ID":"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9","Type":"ContainerDied","Data":"f8b0272cd2fe35be240a50c0e711eb436bc05335fe0527ea4e69dae436cba0f8"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.253728 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b51d-account-create-update-9qmbt" event={"ID":"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9","Type":"ContainerStarted","Data":"564613d17c8c5f3477c33b163fc55f7b1389159732f369284061712229a11b14"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.256205 5047 generic.go:334] "Generic (PLEG): container finished" podID="b8693532-d88f-49d6-a72c-4d751df3a3eb" containerID="79c4349a7439839109ef0f83c46bc4618e14ddc90456258b12240e41e67a98fb" exitCode=0 Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.256278 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-86db-account-create-update-5dpfx" event={"ID":"b8693532-d88f-49d6-a72c-4d751df3a3eb","Type":"ContainerDied","Data":"79c4349a7439839109ef0f83c46bc4618e14ddc90456258b12240e41e67a98fb"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.258169 5047 generic.go:334] "Generic (PLEG): container finished" podID="0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5" containerID="1267325c6c80fc648355dcf0f22835eb1816566e49499882efde0dc53f579027" exitCode=0 Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.258232 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s9nk6" event={"ID":"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5","Type":"ContainerDied","Data":"1267325c6c80fc648355dcf0f22835eb1816566e49499882efde0dc53f579027"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.258315 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s9nk6" 
event={"ID":"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5","Type":"ContainerStarted","Data":"d1576d2707147dc8aaf51a6888db0ac9df0bf5f4a09fdee99021837e745a3cee"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.259993 5047 generic.go:334] "Generic (PLEG): container finished" podID="c2629401-ad79-4aab-b3ec-665c69eeaf78" containerID="752c7e2e7f7fbc780ee3298c5345b10c649bdfe5f98e119767fa42545912892e" exitCode=0 Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.260065 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbb59" event={"ID":"c2629401-ad79-4aab-b3ec-665c69eeaf78","Type":"ContainerDied","Data":"752c7e2e7f7fbc780ee3298c5345b10c649bdfe5f98e119767fa42545912892e"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.262082 5047 generic.go:334] "Generic (PLEG): container finished" podID="2a23e58e-d907-4f3e-92cb-be6652ceb9d6" containerID="db02e6016211bb19ce14e8399eb56289a991d891d3f38632e18d21c7226ecf33" exitCode=0 Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.262138 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b47-account-create-update-btdp7" event={"ID":"2a23e58e-d907-4f3e-92cb-be6652ceb9d6","Type":"ContainerDied","Data":"db02e6016211bb19ce14e8399eb56289a991d891d3f38632e18d21c7226ecf33"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.262197 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b47-account-create-update-btdp7" event={"ID":"2a23e58e-d907-4f3e-92cb-be6652ceb9d6","Type":"ContainerStarted","Data":"3a5a85339eec5608544c4de1483ddd2cc5bcf1c5afece4f3af2083a8d213338c"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.263756 5047 generic.go:334] "Generic (PLEG): container finished" podID="b78afe08-d5e4-45d8-9149-4c0a1020a4cf" containerID="a03cc75d00468d42e82d8719df0248c48ca5c394864f1ed186b8504f10f3ac61" exitCode=0 Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.263813 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-cglrk" event={"ID":"b78afe08-d5e4-45d8-9149-4c0a1020a4cf","Type":"ContainerDied","Data":"a03cc75d00468d42e82d8719df0248c48ca5c394864f1ed186b8504f10f3ac61"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.263843 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cglrk" event={"ID":"b78afe08-d5e4-45d8-9149-4c0a1020a4cf","Type":"ContainerStarted","Data":"2456574d0db0418095cd599763057bf43f0ff0a6de7e1b0a0051ee0d908c2189"} Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.685193 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9ktkb"] Feb 23 07:06:12 crc kubenswrapper[5047]: I0223 07:06:12.704542 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9ktkb"] Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.551295 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.694435 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-operator-scripts\") pod \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.694624 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxqxj\" (UniqueName: \"kubernetes.io/projected/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-kube-api-access-pxqxj\") pod \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\" (UID: \"b78afe08-d5e4-45d8-9149-4c0a1020a4cf\") " Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.696167 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "b78afe08-d5e4-45d8-9149-4c0a1020a4cf" (UID: "b78afe08-d5e4-45d8-9149-4c0a1020a4cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.700247 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.715643 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-kube-api-access-pxqxj" (OuterVolumeSpecName: "kube-api-access-pxqxj") pod "b78afe08-d5e4-45d8-9149-4c0a1020a4cf" (UID: "b78afe08-d5e4-45d8-9149-4c0a1020a4cf"). InnerVolumeSpecName "kube-api-access-pxqxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.802564 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxqxj\" (UniqueName: \"kubernetes.io/projected/b78afe08-d5e4-45d8-9149-4c0a1020a4cf-kube-api-access-pxqxj\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.958819 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.963936 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.971732 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.980106 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:13 crc kubenswrapper[5047]: I0223 07:06:13.982827 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dbb59" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009368 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-operator-scripts\") pod \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009446 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jmmz\" (UniqueName: \"kubernetes.io/projected/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-kube-api-access-6jmmz\") pod \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\" (UID: \"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009538 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2629401-ad79-4aab-b3ec-665c69eeaf78-operator-scripts\") pod \"c2629401-ad79-4aab-b3ec-665c69eeaf78\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009573 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-operator-scripts\") pod \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009591 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5k47\" (UniqueName: \"kubernetes.io/projected/b8693532-d88f-49d6-a72c-4d751df3a3eb-kube-api-access-t5k47\") pod 
\"b8693532-d88f-49d6-a72c-4d751df3a3eb\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009659 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8693532-d88f-49d6-a72c-4d751df3a3eb-operator-scripts\") pod \"b8693532-d88f-49d6-a72c-4d751df3a3eb\" (UID: \"b8693532-d88f-49d6-a72c-4d751df3a3eb\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009716 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n84qj\" (UniqueName: \"kubernetes.io/projected/c2629401-ad79-4aab-b3ec-665c69eeaf78-kube-api-access-n84qj\") pod \"c2629401-ad79-4aab-b3ec-665c69eeaf78\" (UID: \"c2629401-ad79-4aab-b3ec-665c69eeaf78\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009775 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-operator-scripts\") pod \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009855 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk2qp\" (UniqueName: \"kubernetes.io/projected/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-kube-api-access-bk2qp\") pod \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\" (UID: \"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.009882 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqhq6\" (UniqueName: \"kubernetes.io/projected/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-kube-api-access-rqhq6\") pod \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\" (UID: \"2a23e58e-d907-4f3e-92cb-be6652ceb9d6\") " Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.012262 5047 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8693532-d88f-49d6-a72c-4d751df3a3eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8693532-d88f-49d6-a72c-4d751df3a3eb" (UID: "b8693532-d88f-49d6-a72c-4d751df3a3eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.012480 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a23e58e-d907-4f3e-92cb-be6652ceb9d6" (UID: "2a23e58e-d907-4f3e-92cb-be6652ceb9d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.012996 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2629401-ad79-4aab-b3ec-665c69eeaf78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2629401-ad79-4aab-b3ec-665c69eeaf78" (UID: "c2629401-ad79-4aab-b3ec-665c69eeaf78"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.013069 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5" (UID: "0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.014249 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9" (UID: "a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.015150 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2629401-ad79-4aab-b3ec-665c69eeaf78-kube-api-access-n84qj" (OuterVolumeSpecName: "kube-api-access-n84qj") pod "c2629401-ad79-4aab-b3ec-665c69eeaf78" (UID: "c2629401-ad79-4aab-b3ec-665c69eeaf78"). InnerVolumeSpecName "kube-api-access-n84qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.016739 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-kube-api-access-bk2qp" (OuterVolumeSpecName: "kube-api-access-bk2qp") pod "0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5" (UID: "0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5"). InnerVolumeSpecName "kube-api-access-bk2qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.018048 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8693532-d88f-49d6-a72c-4d751df3a3eb-kube-api-access-t5k47" (OuterVolumeSpecName: "kube-api-access-t5k47") pod "b8693532-d88f-49d6-a72c-4d751df3a3eb" (UID: "b8693532-d88f-49d6-a72c-4d751df3a3eb"). InnerVolumeSpecName "kube-api-access-t5k47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.026374 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-kube-api-access-6jmmz" (OuterVolumeSpecName: "kube-api-access-6jmmz") pod "a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9" (UID: "a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9"). InnerVolumeSpecName "kube-api-access-6jmmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.026450 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-kube-api-access-rqhq6" (OuterVolumeSpecName: "kube-api-access-rqhq6") pod "2a23e58e-d907-4f3e-92cb-be6652ceb9d6" (UID: "2a23e58e-d907-4f3e-92cb-be6652ceb9d6"). InnerVolumeSpecName "kube-api-access-rqhq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112401 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqhq6\" (UniqueName: \"kubernetes.io/projected/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-kube-api-access-rqhq6\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112464 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112476 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jmmz\" (UniqueName: \"kubernetes.io/projected/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9-kube-api-access-6jmmz\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112510 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2629401-ad79-4aab-b3ec-665c69eeaf78-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112522 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5k47\" (UniqueName: \"kubernetes.io/projected/b8693532-d88f-49d6-a72c-4d751df3a3eb-kube-api-access-t5k47\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112532 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112540 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8693532-d88f-49d6-a72c-4d751df3a3eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112570 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n84qj\" (UniqueName: \"kubernetes.io/projected/c2629401-ad79-4aab-b3ec-665c69eeaf78-kube-api-access-n84qj\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112581 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a23e58e-d907-4f3e-92cb-be6652ceb9d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.112591 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk2qp\" (UniqueName: \"kubernetes.io/projected/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5-kube-api-access-bk2qp\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.285639 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b47-account-create-update-btdp7" 
event={"ID":"2a23e58e-d907-4f3e-92cb-be6652ceb9d6","Type":"ContainerDied","Data":"3a5a85339eec5608544c4de1483ddd2cc5bcf1c5afece4f3af2083a8d213338c"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.285775 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a5a85339eec5608544c4de1483ddd2cc5bcf1c5afece4f3af2083a8d213338c" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.286327 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b47-account-create-update-btdp7" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.288540 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cglrk" event={"ID":"b78afe08-d5e4-45d8-9149-4c0a1020a4cf","Type":"ContainerDied","Data":"2456574d0db0418095cd599763057bf43f0ff0a6de7e1b0a0051ee0d908c2189"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.288604 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2456574d0db0418095cd599763057bf43f0ff0a6de7e1b0a0051ee0d908c2189" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.288584 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-cglrk" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.293970 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"cf249292c8f2ae891705ce8b850d38cf0ae227f4304c261b1ef67518dde444f6"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.294004 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"3322a1a63d982b007337c016df9dadd09417398badcaa9dcc4d0a6bf4fe12d41"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.294019 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"7f2b74932ff71cab49843de5a99395f2f3995ed6305e9aef4d30ec156d1541a1"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.294292 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"bd9a6c11cbd3bdfd513be881b8c42a51f7ab2e9b360a02ddb2a5e8383e963bd4"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.296052 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b51d-account-create-update-9qmbt" event={"ID":"a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9","Type":"ContainerDied","Data":"564613d17c8c5f3477c33b163fc55f7b1389159732f369284061712229a11b14"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.296118 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="564613d17c8c5f3477c33b163fc55f7b1389159732f369284061712229a11b14" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.296221 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b51d-account-create-update-9qmbt" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.301855 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-86db-account-create-update-5dpfx" event={"ID":"b8693532-d88f-49d6-a72c-4d751df3a3eb","Type":"ContainerDied","Data":"cd1749260948cbb442575177f18b68e30c60722a1a5130ea840bab08205dbce6"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.301879 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd1749260948cbb442575177f18b68e30c60722a1a5130ea840bab08205dbce6" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.301988 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-86db-account-create-update-5dpfx" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.304799 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s9nk6" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.304817 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s9nk6" event={"ID":"0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5","Type":"ContainerDied","Data":"d1576d2707147dc8aaf51a6888db0ac9df0bf5f4a09fdee99021837e745a3cee"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.304869 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1576d2707147dc8aaf51a6888db0ac9df0bf5f4a09fdee99021837e745a3cee" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.308003 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbb59" event={"ID":"c2629401-ad79-4aab-b3ec-665c69eeaf78","Type":"ContainerDied","Data":"f6660fee5c143c2335881bfa39dda7c0c95b037f17778ec86619d9e0e3882d21"} Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.308148 5047 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f6660fee5c143c2335881bfa39dda7c0c95b037f17778ec86619d9e0e3882d21" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.308182 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dbb59" Feb 23 07:06:14 crc kubenswrapper[5047]: I0223 07:06:14.399320 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e093bba7-db7f-4f7c-b498-ab4a4d06c661" path="/var/lib/kubelet/pods/e093bba7-db7f-4f7c-b498-ab4a4d06c661/volumes" Feb 23 07:06:15 crc kubenswrapper[5047]: I0223 07:06:15.213175 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fk6gc" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" probeResult="failure" output=< Feb 23 07:06:15 crc kubenswrapper[5047]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 23 07:06:15 crc kubenswrapper[5047]: > Feb 23 07:06:15 crc kubenswrapper[5047]: I0223 07:06:15.320065 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"e8eadc030f3e0f8a80e20468bb35964a892723d0a32063f32fdb445ccfa9a64c"} Feb 23 07:06:15 crc kubenswrapper[5047]: I0223 07:06:15.320123 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"2271c0a7b8b654682e42f554500d0387d96647dadb59e76a372f57a74ce531ad"} Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.338414 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"5438c98e6edb083f8efe8f16e3d1152881677a6739c97c34bfdb8575aac23a30"} Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.340282 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"1497b14f7cabd5e74a05e2a73df9cf90fa7bf74823b616c5182cd8dc8c2842e9"} Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.354768 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"610194c27d45dd83cb9a83ae67eb964cc41e536d39f862f18b7fbc1666c8f18e"} Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.354841 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"fb9fbedf021ff6c481977b592703012f584d141b404e356a7e74031c5b137c6a"} Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.354890 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerStarted","Data":"5f0a6444a7b19329569218b09c622e3e4c94a546c76d994e6e7f55372800737f"} Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.386262 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.111320813 podStartE2EDuration="25.386229557s" podCreationTimestamp="2026-02-23 07:05:51 +0000 UTC" firstStartedPulling="2026-02-23 07:06:09.597669667 +0000 UTC m=+1291.848996801" lastFinishedPulling="2026-02-23 07:06:14.872578411 +0000 UTC m=+1297.123905545" observedRunningTime="2026-02-23 07:06:16.375224834 +0000 UTC m=+1298.626551988" watchObservedRunningTime="2026-02-23 07:06:16.386229557 +0000 UTC m=+1298.637556711" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.736756 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768666cd57-fxhh5"] Feb 23 07:06:16 crc kubenswrapper[5047]: E0223 07:06:16.737594 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2629401-ad79-4aab-b3ec-665c69eeaf78" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737610 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2629401-ad79-4aab-b3ec-665c69eeaf78" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: E0223 07:06:16.737621 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737629 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: E0223 07:06:16.737657 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78afe08-d5e4-45d8-9149-4c0a1020a4cf" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737663 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78afe08-d5e4-45d8-9149-4c0a1020a4cf" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: E0223 07:06:16.737673 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8693532-d88f-49d6-a72c-4d751df3a3eb" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737678 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8693532-d88f-49d6-a72c-4d751df3a3eb" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: E0223 07:06:16.737686 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737692 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5" containerName="mariadb-database-create" Feb 23 07:06:16 crc 
kubenswrapper[5047]: E0223 07:06:16.737705 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a23e58e-d907-4f3e-92cb-be6652ceb9d6" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737711 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a23e58e-d907-4f3e-92cb-be6652ceb9d6" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737890 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8693532-d88f-49d6-a72c-4d751df3a3eb" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737925 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2629401-ad79-4aab-b3ec-665c69eeaf78" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737934 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a23e58e-d907-4f3e-92cb-be6652ceb9d6" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737944 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78afe08-d5e4-45d8-9149-4c0a1020a4cf" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737958 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9" containerName="mariadb-account-create-update" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.737966 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5" containerName="mariadb-database-create" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.738854 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.741585 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.752748 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-fxhh5"] Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.759484 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.759561 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.767925 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.767988 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgbv\" (UniqueName: \"kubernetes.io/projected/92786ecb-6090-40dc-8942-001a54d35d12-kube-api-access-6tgbv\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 
07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.768020 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-svc\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.768074 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.768098 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-config\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.768114 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.870352 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-config\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc 
kubenswrapper[5047]: I0223 07:06:16.870721 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.870935 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.871149 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tgbv\" (UniqueName: \"kubernetes.io/projected/92786ecb-6090-40dc-8942-001a54d35d12-kube-api-access-6tgbv\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.871681 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-svc\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.871574 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-config\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.871948 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.872572 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-svc\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.873137 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.873451 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.873867 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:16 crc kubenswrapper[5047]: I0223 07:06:16.893468 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6tgbv\" (UniqueName: \"kubernetes.io/projected/92786ecb-6090-40dc-8942-001a54d35d12-kube-api-access-6tgbv\") pod \"dnsmasq-dns-768666cd57-fxhh5\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.060353 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.581254 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-fxhh5"] Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.708182 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8wl9w"] Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.710311 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.715401 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.716709 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8wl9w"] Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.894467 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3757bd5c-675e-4944-9a92-9e684e25ef6d-operator-scripts\") pod \"root-account-create-update-8wl9w\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.894551 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj68j\" (UniqueName: 
\"kubernetes.io/projected/3757bd5c-675e-4944-9a92-9e684e25ef6d-kube-api-access-fj68j\") pod \"root-account-create-update-8wl9w\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.996832 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3757bd5c-675e-4944-9a92-9e684e25ef6d-operator-scripts\") pod \"root-account-create-update-8wl9w\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.996926 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj68j\" (UniqueName: \"kubernetes.io/projected/3757bd5c-675e-4944-9a92-9e684e25ef6d-kube-api-access-fj68j\") pod \"root-account-create-update-8wl9w\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:17 crc kubenswrapper[5047]: I0223 07:06:17.997927 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3757bd5c-675e-4944-9a92-9e684e25ef6d-operator-scripts\") pod \"root-account-create-update-8wl9w\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:18 crc kubenswrapper[5047]: I0223 07:06:18.027729 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj68j\" (UniqueName: \"kubernetes.io/projected/3757bd5c-675e-4944-9a92-9e684e25ef6d-kube-api-access-fj68j\") pod \"root-account-create-update-8wl9w\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:18 crc kubenswrapper[5047]: I0223 07:06:18.087737 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:18 crc kubenswrapper[5047]: I0223 07:06:18.393802 5047 generic.go:334] "Generic (PLEG): container finished" podID="92786ecb-6090-40dc-8942-001a54d35d12" containerID="891892cac348d183815d7ffe54e852d6a7c8f16d8e5289f24d91b7f0c6b96fac" exitCode=0 Feb 23 07:06:18 crc kubenswrapper[5047]: I0223 07:06:18.393931 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" event={"ID":"92786ecb-6090-40dc-8942-001a54d35d12","Type":"ContainerDied","Data":"891892cac348d183815d7ffe54e852d6a7c8f16d8e5289f24d91b7f0c6b96fac"} Feb 23 07:06:18 crc kubenswrapper[5047]: I0223 07:06:18.394272 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" event={"ID":"92786ecb-6090-40dc-8942-001a54d35d12","Type":"ContainerStarted","Data":"67781186d2285d661b545a399242388f4781ba57fdc46dda3600364e2adb17b3"} Feb 23 07:06:18 crc kubenswrapper[5047]: W0223 07:06:18.579257 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3757bd5c_675e_4944_9a92_9e684e25ef6d.slice/crio-91efa71facad5a8b53d795207a200b5667d5469e47f76863f3f26f1be4752dee WatchSource:0}: Error finding container 91efa71facad5a8b53d795207a200b5667d5469e47f76863f3f26f1be4752dee: Status 404 returned error can't find the container with id 91efa71facad5a8b53d795207a200b5667d5469e47f76863f3f26f1be4752dee Feb 23 07:06:18 crc kubenswrapper[5047]: I0223 07:06:18.588801 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8wl9w"] Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.405138 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" event={"ID":"92786ecb-6090-40dc-8942-001a54d35d12","Type":"ContainerStarted","Data":"af59b9535f389d7d7a06a601dbefe85600115748ae0d09b9bba1ac8ff86c9b73"} Feb 23 07:06:19 
crc kubenswrapper[5047]: I0223 07:06:19.405244 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.407156 5047 generic.go:334] "Generic (PLEG): container finished" podID="3757bd5c-675e-4944-9a92-9e684e25ef6d" containerID="c6a6e7d909b55a3b0c1abec01195b34c6c1aa06fec45edbc14f4f0e71b821777" exitCode=0 Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.407198 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wl9w" event={"ID":"3757bd5c-675e-4944-9a92-9e684e25ef6d","Type":"ContainerDied","Data":"c6a6e7d909b55a3b0c1abec01195b34c6c1aa06fec45edbc14f4f0e71b821777"} Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.407219 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wl9w" event={"ID":"3757bd5c-675e-4944-9a92-9e684e25ef6d","Type":"ContainerStarted","Data":"91efa71facad5a8b53d795207a200b5667d5469e47f76863f3f26f1be4752dee"} Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.441667 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" podStartSLOduration=3.441640127 podStartE2EDuration="3.441640127s" podCreationTimestamp="2026-02-23 07:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:19.429437263 +0000 UTC m=+1301.680764397" watchObservedRunningTime="2026-02-23 07:06:19.441640127 +0000 UTC m=+1301.692967261" Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.858254 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9bx9t"] Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.860258 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.864404 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.865222 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rhm9s" Feb 23 07:06:19 crc kubenswrapper[5047]: I0223 07:06:19.875048 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9bx9t"] Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.034788 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-config-data\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.035012 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-combined-ca-bundle\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.035052 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-db-sync-config-data\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.035088 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjbp\" (UniqueName: 
\"kubernetes.io/projected/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-kube-api-access-tzjbp\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.137500 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-config-data\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.137657 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-combined-ca-bundle\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.137691 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-db-sync-config-data\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.137725 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjbp\" (UniqueName: \"kubernetes.io/projected/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-kube-api-access-tzjbp\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.148680 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-db-sync-config-data\") pod \"glance-db-sync-9bx9t\" 
(UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.149238 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-config-data\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.149243 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-combined-ca-bundle\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.168060 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fk6gc" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" probeResult="failure" output=< Feb 23 07:06:20 crc kubenswrapper[5047]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 23 07:06:20 crc kubenswrapper[5047]: > Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.170554 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjbp\" (UniqueName: \"kubernetes.io/projected/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-kube-api-access-tzjbp\") pod \"glance-db-sync-9bx9t\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.240390 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.259157 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.265251 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.522130 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fk6gc-config-fr9np"] Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.523289 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.528340 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.553664 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fk6gc-config-fr9np"] Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.650342 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6w2n\" (UniqueName: \"kubernetes.io/projected/daa39538-419f-4f51-931a-4465c3218f3e-kube-api-access-b6w2n\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.650392 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-log-ovn\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc 
kubenswrapper[5047]: I0223 07:06:20.650439 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run-ovn\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.650469 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-scripts\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.650513 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-additional-scripts\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.650544 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.732310 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.752365 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6w2n\" (UniqueName: \"kubernetes.io/projected/daa39538-419f-4f51-931a-4465c3218f3e-kube-api-access-b6w2n\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.752426 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-log-ovn\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.752474 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run-ovn\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.752501 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-scripts\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.752543 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-additional-scripts\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: 
\"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.752574 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.752984 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.753043 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run-ovn\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.753079 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-log-ovn\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.754927 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-additional-scripts\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " 
pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.755111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-scripts\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.781679 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6w2n\" (UniqueName: \"kubernetes.io/projected/daa39538-419f-4f51-931a-4465c3218f3e-kube-api-access-b6w2n\") pod \"ovn-controller-fk6gc-config-fr9np\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.838234 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9bx9t"] Feb 23 07:06:20 crc kubenswrapper[5047]: W0223 07:06:20.848085 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e1cee2a_62ea_4010_8d71_d9ddd321a9c2.slice/crio-cfefeb266a17cb27f73648472a05b5ff89ca972be675aa29e268d2f514055aba WatchSource:0}: Error finding container cfefeb266a17cb27f73648472a05b5ff89ca972be675aa29e268d2f514055aba: Status 404 returned error can't find the container with id cfefeb266a17cb27f73648472a05b5ff89ca972be675aa29e268d2f514055aba Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.853291 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3757bd5c-675e-4944-9a92-9e684e25ef6d-operator-scripts\") pod \"3757bd5c-675e-4944-9a92-9e684e25ef6d\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.853482 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fj68j\" (UniqueName: \"kubernetes.io/projected/3757bd5c-675e-4944-9a92-9e684e25ef6d-kube-api-access-fj68j\") pod \"3757bd5c-675e-4944-9a92-9e684e25ef6d\" (UID: \"3757bd5c-675e-4944-9a92-9e684e25ef6d\") " Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.853783 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3757bd5c-675e-4944-9a92-9e684e25ef6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3757bd5c-675e-4944-9a92-9e684e25ef6d" (UID: "3757bd5c-675e-4944-9a92-9e684e25ef6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.858209 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3757bd5c-675e-4944-9a92-9e684e25ef6d-kube-api-access-fj68j" (OuterVolumeSpecName: "kube-api-access-fj68j") pod "3757bd5c-675e-4944-9a92-9e684e25ef6d" (UID: "3757bd5c-675e-4944-9a92-9e684e25ef6d"). InnerVolumeSpecName "kube-api-access-fj68j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.859357 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.955892 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj68j\" (UniqueName: \"kubernetes.io/projected/3757bd5c-675e-4944-9a92-9e684e25ef6d-kube-api-access-fj68j\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:20 crc kubenswrapper[5047]: I0223 07:06:20.956509 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3757bd5c-675e-4944-9a92-9e684e25ef6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:21 crc kubenswrapper[5047]: I0223 07:06:21.416824 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fk6gc-config-fr9np"] Feb 23 07:06:21 crc kubenswrapper[5047]: W0223 07:06:21.436422 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa39538_419f_4f51_931a_4465c3218f3e.slice/crio-1f1320131fe4247f35fdc339a049fd036cf9ec61378489fd3816e19545757389 WatchSource:0}: Error finding container 1f1320131fe4247f35fdc339a049fd036cf9ec61378489fd3816e19545757389: Status 404 returned error can't find the container with id 1f1320131fe4247f35fdc339a049fd036cf9ec61378489fd3816e19545757389 Feb 23 07:06:21 crc kubenswrapper[5047]: I0223 07:06:21.437071 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wl9w" event={"ID":"3757bd5c-675e-4944-9a92-9e684e25ef6d","Type":"ContainerDied","Data":"91efa71facad5a8b53d795207a200b5667d5469e47f76863f3f26f1be4752dee"} Feb 23 07:06:21 crc kubenswrapper[5047]: I0223 07:06:21.437318 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91efa71facad5a8b53d795207a200b5667d5469e47f76863f3f26f1be4752dee" Feb 23 07:06:21 crc kubenswrapper[5047]: I0223 07:06:21.437318 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wl9w" Feb 23 07:06:21 crc kubenswrapper[5047]: I0223 07:06:21.440045 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9bx9t" event={"ID":"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2","Type":"ContainerStarted","Data":"cfefeb266a17cb27f73648472a05b5ff89ca972be675aa29e268d2f514055aba"} Feb 23 07:06:22 crc kubenswrapper[5047]: I0223 07:06:22.452561 5047 generic.go:334] "Generic (PLEG): container finished" podID="daa39538-419f-4f51-931a-4465c3218f3e" containerID="d14d86530a147cf0d67613e782e89086bd0d587992f1573cb82b255d08564b9a" exitCode=0 Feb 23 07:06:22 crc kubenswrapper[5047]: I0223 07:06:22.452635 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fk6gc-config-fr9np" event={"ID":"daa39538-419f-4f51-931a-4465c3218f3e","Type":"ContainerDied","Data":"d14d86530a147cf0d67613e782e89086bd0d587992f1573cb82b255d08564b9a"} Feb 23 07:06:22 crc kubenswrapper[5047]: I0223 07:06:22.453484 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fk6gc-config-fr9np" event={"ID":"daa39538-419f-4f51-931a-4465c3218f3e","Type":"ContainerStarted","Data":"1f1320131fe4247f35fdc339a049fd036cf9ec61378489fd3816e19545757389"} Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.824589 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.920674 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-additional-scripts\") pod \"daa39538-419f-4f51-931a-4465c3218f3e\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.920766 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run-ovn\") pod \"daa39538-419f-4f51-931a-4465c3218f3e\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.920798 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run\") pod \"daa39538-419f-4f51-931a-4465c3218f3e\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.920983 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6w2n\" (UniqueName: \"kubernetes.io/projected/daa39538-419f-4f51-931a-4465c3218f3e-kube-api-access-b6w2n\") pod \"daa39538-419f-4f51-931a-4465c3218f3e\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.921045 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-scripts\") pod \"daa39538-419f-4f51-931a-4465c3218f3e\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.921174 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-log-ovn\") pod \"daa39538-419f-4f51-931a-4465c3218f3e\" (UID: \"daa39538-419f-4f51-931a-4465c3218f3e\") " Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.921383 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run" (OuterVolumeSpecName: "var-run") pod "daa39538-419f-4f51-931a-4465c3218f3e" (UID: "daa39538-419f-4f51-931a-4465c3218f3e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.921450 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "daa39538-419f-4f51-931a-4465c3218f3e" (UID: "daa39538-419f-4f51-931a-4465c3218f3e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.921880 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "daa39538-419f-4f51-931a-4465c3218f3e" (UID: "daa39538-419f-4f51-931a-4465c3218f3e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.922647 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "daa39538-419f-4f51-931a-4465c3218f3e" (UID: "daa39538-419f-4f51-931a-4465c3218f3e"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.923210 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-scripts" (OuterVolumeSpecName: "scripts") pod "daa39538-419f-4f51-931a-4465c3218f3e" (UID: "daa39538-419f-4f51-931a-4465c3218f3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.923443 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.923463 5047 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.923477 5047 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daa39538-419f-4f51-931a-4465c3218f3e-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.923491 5047 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.923504 5047 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daa39538-419f-4f51-931a-4465c3218f3e-var-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:23 crc kubenswrapper[5047]: I0223 07:06:23.934093 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/daa39538-419f-4f51-931a-4465c3218f3e-kube-api-access-b6w2n" (OuterVolumeSpecName: "kube-api-access-b6w2n") pod "daa39538-419f-4f51-931a-4465c3218f3e" (UID: "daa39538-419f-4f51-931a-4465c3218f3e"). InnerVolumeSpecName "kube-api-access-b6w2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:24 crc kubenswrapper[5047]: I0223 07:06:24.024780 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6w2n\" (UniqueName: \"kubernetes.io/projected/daa39538-419f-4f51-931a-4465c3218f3e-kube-api-access-b6w2n\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:24 crc kubenswrapper[5047]: I0223 07:06:24.479492 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fk6gc-config-fr9np" event={"ID":"daa39538-419f-4f51-931a-4465c3218f3e","Type":"ContainerDied","Data":"1f1320131fe4247f35fdc339a049fd036cf9ec61378489fd3816e19545757389"} Feb 23 07:06:24 crc kubenswrapper[5047]: I0223 07:06:24.479554 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f1320131fe4247f35fdc339a049fd036cf9ec61378489fd3816e19545757389" Feb 23 07:06:24 crc kubenswrapper[5047]: I0223 07:06:24.479637 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fk6gc-config-fr9np" Feb 23 07:06:24 crc kubenswrapper[5047]: I0223 07:06:24.935011 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fk6gc-config-fr9np"] Feb 23 07:06:24 crc kubenswrapper[5047]: I0223 07:06:24.950240 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fk6gc-config-fr9np"] Feb 23 07:06:25 crc kubenswrapper[5047]: I0223 07:06:25.216529 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fk6gc" Feb 23 07:06:25 crc kubenswrapper[5047]: I0223 07:06:25.994232 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.350802 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa39538-419f-4f51-931a-4465c3218f3e" path="/var/lib/kubelet/pods/daa39538-419f-4f51-931a-4465c3218f3e/volumes" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.399081 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hfcz7"] Feb 23 07:06:26 crc kubenswrapper[5047]: E0223 07:06:26.399510 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa39538-419f-4f51-931a-4465c3218f3e" containerName="ovn-config" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.399528 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa39538-419f-4f51-931a-4465c3218f3e" containerName="ovn-config" Feb 23 07:06:26 crc kubenswrapper[5047]: E0223 07:06:26.399552 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3757bd5c-675e-4944-9a92-9e684e25ef6d" containerName="mariadb-account-create-update" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.399559 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3757bd5c-675e-4944-9a92-9e684e25ef6d" containerName="mariadb-account-create-update" Feb 23 07:06:26 crc 
kubenswrapper[5047]: I0223 07:06:26.399746 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa39538-419f-4f51-931a-4465c3218f3e" containerName="ovn-config" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.399758 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3757bd5c-675e-4944-9a92-9e684e25ef6d" containerName="mariadb-account-create-update" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.400406 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hfcz7" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.412451 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hfcz7"] Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.520585 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-965e-account-create-update-hv27b"] Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.521698 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-965e-account-create-update-hv27b" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.524405 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.532994 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-965e-account-create-update-hv27b"] Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.579086 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.583353 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-574bl\" (UniqueName: \"kubernetes.io/projected/7e852f86-de0c-4691-b6dc-c0149b5932fb-kube-api-access-574bl\") pod \"cinder-db-create-hfcz7\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " pod="openstack/cinder-db-create-hfcz7" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.583521 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e852f86-de0c-4691-b6dc-c0149b5932fb-operator-scripts\") pod \"cinder-db-create-hfcz7\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " pod="openstack/cinder-db-create-hfcz7" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.615702 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ll9m9"] Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.617207 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ll9m9" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.629652 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c0b-account-create-update-vwfrq"] Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.631192 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-vwfrq" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.633161 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.636222 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ll9m9"] Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.685304 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/146d74a4-9a27-47d9-87e7-3372501acab0-operator-scripts\") pod \"barbican-965e-account-create-update-hv27b\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " pod="openstack/barbican-965e-account-create-update-hv27b" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.685385 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-574bl\" (UniqueName: \"kubernetes.io/projected/7e852f86-de0c-4691-b6dc-c0149b5932fb-kube-api-access-574bl\") pod \"cinder-db-create-hfcz7\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " pod="openstack/cinder-db-create-hfcz7" Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.685538 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppc8s\" (UniqueName: \"kubernetes.io/projected/146d74a4-9a27-47d9-87e7-3372501acab0-kube-api-access-ppc8s\") pod \"barbican-965e-account-create-update-hv27b\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " 
pod="openstack/barbican-965e-account-create-update-hv27b"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.685622 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e852f86-de0c-4691-b6dc-c0149b5932fb-operator-scripts\") pod \"cinder-db-create-hfcz7\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " pod="openstack/cinder-db-create-hfcz7"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.688122 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e852f86-de0c-4691-b6dc-c0149b5932fb-operator-scripts\") pod \"cinder-db-create-hfcz7\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " pod="openstack/cinder-db-create-hfcz7"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.717966 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-574bl\" (UniqueName: \"kubernetes.io/projected/7e852f86-de0c-4691-b6dc-c0149b5932fb-kube-api-access-574bl\") pod \"cinder-db-create-hfcz7\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " pod="openstack/cinder-db-create-hfcz7"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.719954 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c0b-account-create-update-vwfrq"]
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.720333 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hfcz7"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.787650 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppc8s\" (UniqueName: \"kubernetes.io/projected/146d74a4-9a27-47d9-87e7-3372501acab0-kube-api-access-ppc8s\") pod \"barbican-965e-account-create-update-hv27b\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " pod="openstack/barbican-965e-account-create-update-hv27b"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.788212 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6427b69-b06b-4935-9206-ae09750f8900-operator-scripts\") pod \"barbican-db-create-ll9m9\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " pod="openstack/barbican-db-create-ll9m9"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.788240 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq8lz\" (UniqueName: \"kubernetes.io/projected/1efebf35-1d9c-450d-abb5-6e247edc9db2-kube-api-access-pq8lz\") pod \"cinder-8c0b-account-create-update-vwfrq\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " pod="openstack/cinder-8c0b-account-create-update-vwfrq"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.788277 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efebf35-1d9c-450d-abb5-6e247edc9db2-operator-scripts\") pod \"cinder-8c0b-account-create-update-vwfrq\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " pod="openstack/cinder-8c0b-account-create-update-vwfrq"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.788301 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbwk\" (UniqueName: \"kubernetes.io/projected/b6427b69-b06b-4935-9206-ae09750f8900-kube-api-access-vjbwk\") pod \"barbican-db-create-ll9m9\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " pod="openstack/barbican-db-create-ll9m9"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.788357 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/146d74a4-9a27-47d9-87e7-3372501acab0-operator-scripts\") pod \"barbican-965e-account-create-update-hv27b\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " pod="openstack/barbican-965e-account-create-update-hv27b"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.789352 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/146d74a4-9a27-47d9-87e7-3372501acab0-operator-scripts\") pod \"barbican-965e-account-create-update-hv27b\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " pod="openstack/barbican-965e-account-create-update-hv27b"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.824882 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8tflc"]
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.826283 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.835655 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-j4sn8"]
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.841728 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.845102 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.845362 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.845488 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.845705 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-whfdb"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.859336 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8tflc"]
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.866161 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppc8s\" (UniqueName: \"kubernetes.io/projected/146d74a4-9a27-47d9-87e7-3372501acab0-kube-api-access-ppc8s\") pod \"barbican-965e-account-create-update-hv27b\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " pod="openstack/barbican-965e-account-create-update-hv27b"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.869132 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e7d5-account-create-update-lwfj5"]
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.870895 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.875204 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.883058 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j4sn8"]
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.890212 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e7d5-account-create-update-lwfj5"]
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.891445 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efebf35-1d9c-450d-abb5-6e247edc9db2-operator-scripts\") pod \"cinder-8c0b-account-create-update-vwfrq\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " pod="openstack/cinder-8c0b-account-create-update-vwfrq"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.891487 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbwk\" (UniqueName: \"kubernetes.io/projected/b6427b69-b06b-4935-9206-ae09750f8900-kube-api-access-vjbwk\") pod \"barbican-db-create-ll9m9\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " pod="openstack/barbican-db-create-ll9m9"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.891604 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6427b69-b06b-4935-9206-ae09750f8900-operator-scripts\") pod \"barbican-db-create-ll9m9\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " pod="openstack/barbican-db-create-ll9m9"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.891627 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq8lz\" (UniqueName: \"kubernetes.io/projected/1efebf35-1d9c-450d-abb5-6e247edc9db2-kube-api-access-pq8lz\") pod \"cinder-8c0b-account-create-update-vwfrq\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " pod="openstack/cinder-8c0b-account-create-update-vwfrq"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.892806 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6427b69-b06b-4935-9206-ae09750f8900-operator-scripts\") pod \"barbican-db-create-ll9m9\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " pod="openstack/barbican-db-create-ll9m9"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.893217 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efebf35-1d9c-450d-abb5-6e247edc9db2-operator-scripts\") pod \"cinder-8c0b-account-create-update-vwfrq\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " pod="openstack/cinder-8c0b-account-create-update-vwfrq"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.915639 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq8lz\" (UniqueName: \"kubernetes.io/projected/1efebf35-1d9c-450d-abb5-6e247edc9db2-kube-api-access-pq8lz\") pod \"cinder-8c0b-account-create-update-vwfrq\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " pod="openstack/cinder-8c0b-account-create-update-vwfrq"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.925241 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbwk\" (UniqueName: \"kubernetes.io/projected/b6427b69-b06b-4935-9206-ae09750f8900-kube-api-access-vjbwk\") pod \"barbican-db-create-ll9m9\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " pod="openstack/barbican-db-create-ll9m9"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.979118 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ll9m9"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.993382 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e380efd-ee4f-4339-b6ad-5688849a2c93-operator-scripts\") pod \"neutron-db-create-j4sn8\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.993840 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-combined-ca-bundle\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.993889 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjxvn\" (UniqueName: \"kubernetes.io/projected/199f6db1-c082-418e-9a55-330a39b15ed7-kube-api-access-zjxvn\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.993939 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf8wn\" (UniqueName: \"kubernetes.io/projected/63179dda-c120-44c4-9f07-1cef37ca07e0-kube-api-access-gf8wn\") pod \"neutron-e7d5-account-create-update-lwfj5\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.993985 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfr6p\" (UniqueName: \"kubernetes.io/projected/7e380efd-ee4f-4339-b6ad-5688849a2c93-kube-api-access-vfr6p\") pod \"neutron-db-create-j4sn8\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.994026 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63179dda-c120-44c4-9f07-1cef37ca07e0-operator-scripts\") pod \"neutron-e7d5-account-create-update-lwfj5\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.994050 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-config-data\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:26 crc kubenswrapper[5047]: I0223 07:06:26.998672 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-vwfrq"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.065096 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-768666cd57-fxhh5"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.097234 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-combined-ca-bundle\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.097336 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjxvn\" (UniqueName: \"kubernetes.io/projected/199f6db1-c082-418e-9a55-330a39b15ed7-kube-api-access-zjxvn\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.097378 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf8wn\" (UniqueName: \"kubernetes.io/projected/63179dda-c120-44c4-9f07-1cef37ca07e0-kube-api-access-gf8wn\") pod \"neutron-e7d5-account-create-update-lwfj5\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.097404 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfr6p\" (UniqueName: \"kubernetes.io/projected/7e380efd-ee4f-4339-b6ad-5688849a2c93-kube-api-access-vfr6p\") pod \"neutron-db-create-j4sn8\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.097449 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63179dda-c120-44c4-9f07-1cef37ca07e0-operator-scripts\") pod \"neutron-e7d5-account-create-update-lwfj5\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.097478 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-config-data\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.097555 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e380efd-ee4f-4339-b6ad-5688849a2c93-operator-scripts\") pod \"neutron-db-create-j4sn8\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.098309 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e380efd-ee4f-4339-b6ad-5688849a2c93-operator-scripts\") pod \"neutron-db-create-j4sn8\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.099137 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63179dda-c120-44c4-9f07-1cef37ca07e0-operator-scripts\") pod \"neutron-e7d5-account-create-update-lwfj5\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.109307 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-config-data\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.117954 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-combined-ca-bundle\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.133223 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfr6p\" (UniqueName: \"kubernetes.io/projected/7e380efd-ee4f-4339-b6ad-5688849a2c93-kube-api-access-vfr6p\") pod \"neutron-db-create-j4sn8\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.136276 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjxvn\" (UniqueName: \"kubernetes.io/projected/199f6db1-c082-418e-9a55-330a39b15ed7-kube-api-access-zjxvn\") pod \"keystone-db-sync-8tflc\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.138654 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf8wn\" (UniqueName: \"kubernetes.io/projected/63179dda-c120-44c4-9f07-1cef37ca07e0-kube-api-access-gf8wn\") pod \"neutron-e7d5-account-create-update-lwfj5\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.139213 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-965e-account-create-update-hv27b"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.185859 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-tx8hv"]
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.186431 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerName="dnsmasq-dns" containerID="cri-o://828610b690bbe81f9a30874069a355a617bd2aee568a38e0062ad76bb08b84ce" gracePeriod=10
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.203780 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8tflc"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.209860 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j4sn8"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.257558 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-lwfj5"
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.470998 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hfcz7"]
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.480441 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c0b-account-create-update-vwfrq"]
Feb 23 07:06:27 crc kubenswrapper[5047]: I0223 07:06:27.570227 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ll9m9"]
Feb 23 07:06:27 crc kubenswrapper[5047]: W0223 07:06:27.748610 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e852f86_de0c_4691_b6dc_c0149b5932fb.slice/crio-1c7ec62c1b74c7b9291caf13795634daaf31ee172aced523666a2bf221a73d34 WatchSource:0}: Error finding container 1c7ec62c1b74c7b9291caf13795634daaf31ee172aced523666a2bf221a73d34: Status 404 returned error can't find the container with id 1c7ec62c1b74c7b9291caf13795634daaf31ee172aced523666a2bf221a73d34
Feb 23 07:06:27 crc kubenswrapper[5047]: W0223 07:06:27.750607 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1efebf35_1d9c_450d_abb5_6e247edc9db2.slice/crio-8f6779daf6f4134b686615998db844428d709a15f5430c3a63d53ceffcf2936f WatchSource:0}: Error finding container 8f6779daf6f4134b686615998db844428d709a15f5430c3a63d53ceffcf2936f: Status 404 returned error can't find the container with id 8f6779daf6f4134b686615998db844428d709a15f5430c3a63d53ceffcf2936f
Feb 23 07:06:27 crc kubenswrapper[5047]: W0223 07:06:27.759410 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6427b69_b06b_4935_9206_ae09750f8900.slice/crio-d00f3c8ae8e09aa51863e10627e61330322d30dea2cb8a89281a8b1a631f80bf WatchSource:0}: Error finding container d00f3c8ae8e09aa51863e10627e61330322d30dea2cb8a89281a8b1a631f80bf: Status 404 returned error can't find the container with id d00f3c8ae8e09aa51863e10627e61330322d30dea2cb8a89281a8b1a631f80bf
Feb 23 07:06:28 crc kubenswrapper[5047]: I0223 07:06:28.366268 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-965e-account-create-update-hv27b"]
Feb 23 07:06:28 crc kubenswrapper[5047]: I0223 07:06:28.520883 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ll9m9" event={"ID":"b6427b69-b06b-4935-9206-ae09750f8900","Type":"ContainerStarted","Data":"d00f3c8ae8e09aa51863e10627e61330322d30dea2cb8a89281a8b1a631f80bf"}
Feb 23 07:06:28 crc kubenswrapper[5047]: I0223 07:06:28.523485 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c0b-account-create-update-vwfrq" event={"ID":"1efebf35-1d9c-450d-abb5-6e247edc9db2","Type":"ContainerStarted","Data":"8f6779daf6f4134b686615998db844428d709a15f5430c3a63d53ceffcf2936f"}
Feb 23 07:06:28 crc kubenswrapper[5047]: I0223 07:06:28.527159 5047 generic.go:334] "Generic (PLEG): container finished" podID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerID="828610b690bbe81f9a30874069a355a617bd2aee568a38e0062ad76bb08b84ce" exitCode=0
Feb 23 07:06:28 crc kubenswrapper[5047]: I0223 07:06:28.527228 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" event={"ID":"8c765369-da46-4bff-96f0-279d6c0b3f2c","Type":"ContainerDied","Data":"828610b690bbe81f9a30874069a355a617bd2aee568a38e0062ad76bb08b84ce"}
Feb 23 07:06:28 crc kubenswrapper[5047]: I0223 07:06:28.533319 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hfcz7" event={"ID":"7e852f86-de0c-4691-b6dc-c0149b5932fb","Type":"ContainerStarted","Data":"1c7ec62c1b74c7b9291caf13795634daaf31ee172aced523666a2bf221a73d34"}
Feb 23 07:06:35 crc kubenswrapper[5047]: W0223 07:06:35.347628 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod146d74a4_9a27_47d9_87e7_3372501acab0.slice/crio-ae40cdd770e6ace7d422f09428e80114e09a71b3ffb1a11662b3e070f325b2d0 WatchSource:0}: Error finding container ae40cdd770e6ace7d422f09428e80114e09a71b3ffb1a11662b3e070f325b2d0: Status 404 returned error can't find the container with id ae40cdd770e6ace7d422f09428e80114e09a71b3ffb1a11662b3e070f325b2d0
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.465216 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv"
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.543988 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-config\") pod \"8c765369-da46-4bff-96f0-279d6c0b3f2c\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") "
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.544043 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-dns-svc\") pod \"8c765369-da46-4bff-96f0-279d6c0b3f2c\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") "
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.544393 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-sb\") pod \"8c765369-da46-4bff-96f0-279d6c0b3f2c\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") "
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.544426 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtzl8\" (UniqueName: \"kubernetes.io/projected/8c765369-da46-4bff-96f0-279d6c0b3f2c-kube-api-access-qtzl8\") pod \"8c765369-da46-4bff-96f0-279d6c0b3f2c\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") "
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.545950 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-nb\") pod \"8c765369-da46-4bff-96f0-279d6c0b3f2c\" (UID: \"8c765369-da46-4bff-96f0-279d6c0b3f2c\") "
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.557468 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c765369-da46-4bff-96f0-279d6c0b3f2c-kube-api-access-qtzl8" (OuterVolumeSpecName: "kube-api-access-qtzl8") pod "8c765369-da46-4bff-96f0-279d6c0b3f2c" (UID: "8c765369-da46-4bff-96f0-279d6c0b3f2c"). InnerVolumeSpecName "kube-api-access-qtzl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.602347 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c765369-da46-4bff-96f0-279d6c0b3f2c" (UID: "8c765369-da46-4bff-96f0-279d6c0b3f2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.602382 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c765369-da46-4bff-96f0-279d6c0b3f2c" (UID: "8c765369-da46-4bff-96f0-279d6c0b3f2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.615956 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c765369-da46-4bff-96f0-279d6c0b3f2c" (UID: "8c765369-da46-4bff-96f0-279d6c0b3f2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.625681 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-config" (OuterVolumeSpecName: "config") pod "8c765369-da46-4bff-96f0-279d6c0b3f2c" (UID: "8c765369-da46-4bff-96f0-279d6c0b3f2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.627185 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" event={"ID":"8c765369-da46-4bff-96f0-279d6c0b3f2c","Type":"ContainerDied","Data":"aa35409474a6e9fe7bdbc1a97dbe79db571b94bfde41cde6ede6d634b1fd7e1f"}
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.627248 5047 scope.go:117] "RemoveContainer" containerID="828610b690bbe81f9a30874069a355a617bd2aee568a38e0062ad76bb08b84ce"
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.627389 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv"
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.633340 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-965e-account-create-update-hv27b" event={"ID":"146d74a4-9a27-47d9-87e7-3372501acab0","Type":"ContainerStarted","Data":"ae40cdd770e6ace7d422f09428e80114e09a71b3ffb1a11662b3e070f325b2d0"}
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.648462 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.648494 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.648503 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.648513 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c765369-da46-4bff-96f0-279d6c0b3f2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.648523 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtzl8\" (UniqueName: \"kubernetes.io/projected/8c765369-da46-4bff-96f0-279d6c0b3f2c-kube-api-access-qtzl8\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.670402 5047 scope.go:117] "RemoveContainer" containerID="0737c350066d281c56cf052bf14fe4e41d910589f64b82514b9bb7568b6dfc8b"
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.717410 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-tx8hv"]
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.725618 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-tx8hv"]
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.848140 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j4sn8"]
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.901125 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8tflc"]
Feb 23 07:06:35 crc kubenswrapper[5047]: I0223 07:06:35.908523 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e7d5-account-create-update-lwfj5"]
Feb 23 07:06:35 crc kubenswrapper[5047]: W0223 07:06:35.909815 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod199f6db1_c082_418e_9a55_330a39b15ed7.slice/crio-a09bcd4a7a34f9e1bc185d18ba6e55e731bfa5eff11b6faa01c9dc93f3a9252a WatchSource:0}: Error finding container a09bcd4a7a34f9e1bc185d18ba6e55e731bfa5eff11b6faa01c9dc93f3a9252a: Status 404 returned error can't find the container with id a09bcd4a7a34f9e1bc185d18ba6e55e731bfa5eff11b6faa01c9dc93f3a9252a
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.368054 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" path="/var/lib/kubelet/pods/8c765369-da46-4bff-96f0-279d6c0b3f2c/volumes"
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.645332 5047 generic.go:334] "Generic (PLEG): container finished" podID="63179dda-c120-44c4-9f07-1cef37ca07e0" containerID="5bea11547183c48486f53b580384d648b13b5009d4bcf4723ac380d79f01172f" exitCode=0
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.645409 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e7d5-account-create-update-lwfj5" event={"ID":"63179dda-c120-44c4-9f07-1cef37ca07e0","Type":"ContainerDied","Data":"5bea11547183c48486f53b580384d648b13b5009d4bcf4723ac380d79f01172f"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.645443 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e7d5-account-create-update-lwfj5" event={"ID":"63179dda-c120-44c4-9f07-1cef37ca07e0","Type":"ContainerStarted","Data":"633ac4ef62cfd134ae07374e84d2a915689a70733ff81e26681be91fe3e8d63a"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.649687 5047 generic.go:334] "Generic (PLEG): container finished" podID="7e380efd-ee4f-4339-b6ad-5688849a2c93" containerID="a1572c2ddd422b04e8cd0b8fc284f5c4a477827e929ddc442a91b9dc6f919ca6" exitCode=0
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.649909 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j4sn8" event={"ID":"7e380efd-ee4f-4339-b6ad-5688849a2c93","Type":"ContainerDied","Data":"a1572c2ddd422b04e8cd0b8fc284f5c4a477827e929ddc442a91b9dc6f919ca6"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.650050 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j4sn8" event={"ID":"7e380efd-ee4f-4339-b6ad-5688849a2c93","Type":"ContainerStarted","Data":"2ee807ab586409198b183d6d1f7b2d3c92dbab9b310cb33704fb279bfb236994"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.651566 5047 generic.go:334] "Generic (PLEG): container finished" podID="1efebf35-1d9c-450d-abb5-6e247edc9db2" containerID="78644fef8da1bd0cdac4bf80a09b3f519c55c9bc25ca47733d8b832f3f4e6e06" exitCode=0
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.651618 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c0b-account-create-update-vwfrq" event={"ID":"1efebf35-1d9c-450d-abb5-6e247edc9db2","Type":"ContainerDied","Data":"78644fef8da1bd0cdac4bf80a09b3f519c55c9bc25ca47733d8b832f3f4e6e06"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.653249 5047 generic.go:334] "Generic (PLEG): container finished" podID="146d74a4-9a27-47d9-87e7-3372501acab0" containerID="5a75079717a470b29de40ec2a4f36e2754c02b302a2199b3ba843bf798995f82" exitCode=0
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.653289 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-965e-account-create-update-hv27b" event={"ID":"146d74a4-9a27-47d9-87e7-3372501acab0","Type":"ContainerDied","Data":"5a75079717a470b29de40ec2a4f36e2754c02b302a2199b3ba843bf798995f82"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.656839 5047 generic.go:334] "Generic (PLEG): container finished" podID="7e852f86-de0c-4691-b6dc-c0149b5932fb" containerID="0ad3ee1b7a850db2d2c262c0e56cbbd4c72b1c1fab94b408cf7c5149a500b02c" exitCode=0
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.657106 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hfcz7" event={"ID":"7e852f86-de0c-4691-b6dc-c0149b5932fb","Type":"ContainerDied","Data":"0ad3ee1b7a850db2d2c262c0e56cbbd4c72b1c1fab94b408cf7c5149a500b02c"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.658294 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8tflc" event={"ID":"199f6db1-c082-418e-9a55-330a39b15ed7","Type":"ContainerStarted","Data":"a09bcd4a7a34f9e1bc185d18ba6e55e731bfa5eff11b6faa01c9dc93f3a9252a"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.661528 5047 generic.go:334] "Generic (PLEG): container finished" podID="b6427b69-b06b-4935-9206-ae09750f8900" containerID="c322513d0f04b9b096c0fe9678a7d5ad8b11f3442f3ff8b0d76101352a14947a" exitCode=0
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.661627 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ll9m9" event={"ID":"b6427b69-b06b-4935-9206-ae09750f8900","Type":"ContainerDied","Data":"c322513d0f04b9b096c0fe9678a7d5ad8b11f3442f3ff8b0d76101352a14947a"}
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.684292 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9bx9t" podStartSLOduration=2.977652316 podStartE2EDuration="17.68427048s" podCreationTimestamp="2026-02-23 07:06:19 +0000 UTC" firstStartedPulling="2026-02-23 07:06:20.85098525 +0000 UTC m=+1303.102312384" lastFinishedPulling="2026-02-23 07:06:35.557603414 +0000 UTC m=+1317.808930548" observedRunningTime="2026-02-23 07:06:36.678709312 +0000 UTC m=+1318.930036446" watchObservedRunningTime="2026-02-23 07:06:36.68427048 +0000 UTC m=+1318.935597614"
Feb 23 07:06:36 crc kubenswrapper[5047]: I0223 07:06:36.995724 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-689df5d84f-tx8hv" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout"
Feb 23 07:06:37 crc kubenswrapper[5047]: I0223 07:06:37.687799 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9bx9t" event={"ID":"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2","Type":"ContainerStarted","Data":"2d797d6a9c01a1cb9479944763429bf1039d9581a793335a6b8e1f9d395e3154"}
Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.746543 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ll9m9" event={"ID":"b6427b69-b06b-4935-9206-ae09750f8900","Type":"ContainerDied","Data":"d00f3c8ae8e09aa51863e10627e61330322d30dea2cb8a89281a8b1a631f80bf"}
Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.747074 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d00f3c8ae8e09aa51863e10627e61330322d30dea2cb8a89281a8b1a631f80bf"
Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.752518 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e7d5-account-create-update-lwfj5" 
event={"ID":"63179dda-c120-44c4-9f07-1cef37ca07e0","Type":"ContainerDied","Data":"633ac4ef62cfd134ae07374e84d2a915689a70733ff81e26681be91fe3e8d63a"} Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.752576 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="633ac4ef62cfd134ae07374e84d2a915689a70733ff81e26681be91fe3e8d63a" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.755441 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j4sn8" event={"ID":"7e380efd-ee4f-4339-b6ad-5688849a2c93","Type":"ContainerDied","Data":"2ee807ab586409198b183d6d1f7b2d3c92dbab9b310cb33704fb279bfb236994"} Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.755460 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ee807ab586409198b183d6d1f7b2d3c92dbab9b310cb33704fb279bfb236994" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.757022 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c0b-account-create-update-vwfrq" event={"ID":"1efebf35-1d9c-450d-abb5-6e247edc9db2","Type":"ContainerDied","Data":"8f6779daf6f4134b686615998db844428d709a15f5430c3a63d53ceffcf2936f"} Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.757050 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f6779daf6f4134b686615998db844428d709a15f5430c3a63d53ceffcf2936f" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.758765 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-965e-account-create-update-hv27b" event={"ID":"146d74a4-9a27-47d9-87e7-3372501acab0","Type":"ContainerDied","Data":"ae40cdd770e6ace7d422f09428e80114e09a71b3ffb1a11662b3e070f325b2d0"} Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.758780 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae40cdd770e6ace7d422f09428e80114e09a71b3ffb1a11662b3e070f325b2d0" Feb 23 07:06:40 crc 
kubenswrapper[5047]: I0223 07:06:40.760872 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hfcz7" event={"ID":"7e852f86-de0c-4691-b6dc-c0149b5932fb","Type":"ContainerDied","Data":"1c7ec62c1b74c7b9291caf13795634daaf31ee172aced523666a2bf221a73d34"} Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.760967 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c7ec62c1b74c7b9291caf13795634daaf31ee172aced523666a2bf221a73d34" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.765869 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ll9m9" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.778238 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-lwfj5" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.802156 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j4sn8" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.826499 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-vwfrq" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.853223 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-965e-account-create-update-hv27b" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.865278 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hfcz7" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.866546 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfr6p\" (UniqueName: \"kubernetes.io/projected/7e380efd-ee4f-4339-b6ad-5688849a2c93-kube-api-access-vfr6p\") pod \"7e380efd-ee4f-4339-b6ad-5688849a2c93\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.866609 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efebf35-1d9c-450d-abb5-6e247edc9db2-operator-scripts\") pod \"1efebf35-1d9c-450d-abb5-6e247edc9db2\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.866660 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppc8s\" (UniqueName: \"kubernetes.io/projected/146d74a4-9a27-47d9-87e7-3372501acab0-kube-api-access-ppc8s\") pod \"146d74a4-9a27-47d9-87e7-3372501acab0\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.867903 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1efebf35-1d9c-450d-abb5-6e247edc9db2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1efebf35-1d9c-450d-abb5-6e247edc9db2" (UID: "1efebf35-1d9c-450d-abb5-6e247edc9db2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.868838 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq8lz\" (UniqueName: \"kubernetes.io/projected/1efebf35-1d9c-450d-abb5-6e247edc9db2-kube-api-access-pq8lz\") pod \"1efebf35-1d9c-450d-abb5-6e247edc9db2\" (UID: \"1efebf35-1d9c-450d-abb5-6e247edc9db2\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.869032 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjbwk\" (UniqueName: \"kubernetes.io/projected/b6427b69-b06b-4935-9206-ae09750f8900-kube-api-access-vjbwk\") pod \"b6427b69-b06b-4935-9206-ae09750f8900\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.869155 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/146d74a4-9a27-47d9-87e7-3372501acab0-operator-scripts\") pod \"146d74a4-9a27-47d9-87e7-3372501acab0\" (UID: \"146d74a4-9a27-47d9-87e7-3372501acab0\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.869187 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63179dda-c120-44c4-9f07-1cef37ca07e0-operator-scripts\") pod \"63179dda-c120-44c4-9f07-1cef37ca07e0\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.869248 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e380efd-ee4f-4339-b6ad-5688849a2c93-operator-scripts\") pod \"7e380efd-ee4f-4339-b6ad-5688849a2c93\" (UID: \"7e380efd-ee4f-4339-b6ad-5688849a2c93\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.869299 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6427b69-b06b-4935-9206-ae09750f8900-operator-scripts\") pod \"b6427b69-b06b-4935-9206-ae09750f8900\" (UID: \"b6427b69-b06b-4935-9206-ae09750f8900\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.869362 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf8wn\" (UniqueName: \"kubernetes.io/projected/63179dda-c120-44c4-9f07-1cef37ca07e0-kube-api-access-gf8wn\") pod \"63179dda-c120-44c4-9f07-1cef37ca07e0\" (UID: \"63179dda-c120-44c4-9f07-1cef37ca07e0\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.869804 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1efebf35-1d9c-450d-abb5-6e247edc9db2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.871330 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146d74a4-9a27-47d9-87e7-3372501acab0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "146d74a4-9a27-47d9-87e7-3372501acab0" (UID: "146d74a4-9a27-47d9-87e7-3372501acab0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.871772 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e380efd-ee4f-4339-b6ad-5688849a2c93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e380efd-ee4f-4339-b6ad-5688849a2c93" (UID: "7e380efd-ee4f-4339-b6ad-5688849a2c93"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.872336 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63179dda-c120-44c4-9f07-1cef37ca07e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63179dda-c120-44c4-9f07-1cef37ca07e0" (UID: "63179dda-c120-44c4-9f07-1cef37ca07e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.873184 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6427b69-b06b-4935-9206-ae09750f8900-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6427b69-b06b-4935-9206-ae09750f8900" (UID: "b6427b69-b06b-4935-9206-ae09750f8900"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.875855 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e380efd-ee4f-4339-b6ad-5688849a2c93-kube-api-access-vfr6p" (OuterVolumeSpecName: "kube-api-access-vfr6p") pod "7e380efd-ee4f-4339-b6ad-5688849a2c93" (UID: "7e380efd-ee4f-4339-b6ad-5688849a2c93"). InnerVolumeSpecName "kube-api-access-vfr6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.877616 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63179dda-c120-44c4-9f07-1cef37ca07e0-kube-api-access-gf8wn" (OuterVolumeSpecName: "kube-api-access-gf8wn") pod "63179dda-c120-44c4-9f07-1cef37ca07e0" (UID: "63179dda-c120-44c4-9f07-1cef37ca07e0"). InnerVolumeSpecName "kube-api-access-gf8wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.877774 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6427b69-b06b-4935-9206-ae09750f8900-kube-api-access-vjbwk" (OuterVolumeSpecName: "kube-api-access-vjbwk") pod "b6427b69-b06b-4935-9206-ae09750f8900" (UID: "b6427b69-b06b-4935-9206-ae09750f8900"). InnerVolumeSpecName "kube-api-access-vjbwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.879068 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efebf35-1d9c-450d-abb5-6e247edc9db2-kube-api-access-pq8lz" (OuterVolumeSpecName: "kube-api-access-pq8lz") pod "1efebf35-1d9c-450d-abb5-6e247edc9db2" (UID: "1efebf35-1d9c-450d-abb5-6e247edc9db2"). InnerVolumeSpecName "kube-api-access-pq8lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.879869 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146d74a4-9a27-47d9-87e7-3372501acab0-kube-api-access-ppc8s" (OuterVolumeSpecName: "kube-api-access-ppc8s") pod "146d74a4-9a27-47d9-87e7-3372501acab0" (UID: "146d74a4-9a27-47d9-87e7-3372501acab0"). InnerVolumeSpecName "kube-api-access-ppc8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970380 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e852f86-de0c-4691-b6dc-c0149b5932fb-operator-scripts\") pod \"7e852f86-de0c-4691-b6dc-c0149b5932fb\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970450 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-574bl\" (UniqueName: \"kubernetes.io/projected/7e852f86-de0c-4691-b6dc-c0149b5932fb-kube-api-access-574bl\") pod \"7e852f86-de0c-4691-b6dc-c0149b5932fb\" (UID: \"7e852f86-de0c-4691-b6dc-c0149b5932fb\") " Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970767 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf8wn\" (UniqueName: \"kubernetes.io/projected/63179dda-c120-44c4-9f07-1cef37ca07e0-kube-api-access-gf8wn\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970790 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfr6p\" (UniqueName: \"kubernetes.io/projected/7e380efd-ee4f-4339-b6ad-5688849a2c93-kube-api-access-vfr6p\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970801 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppc8s\" (UniqueName: \"kubernetes.io/projected/146d74a4-9a27-47d9-87e7-3372501acab0-kube-api-access-ppc8s\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970813 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq8lz\" (UniqueName: \"kubernetes.io/projected/1efebf35-1d9c-450d-abb5-6e247edc9db2-kube-api-access-pq8lz\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970826 5047 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjbwk\" (UniqueName: \"kubernetes.io/projected/b6427b69-b06b-4935-9206-ae09750f8900-kube-api-access-vjbwk\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970835 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/146d74a4-9a27-47d9-87e7-3372501acab0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970844 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63179dda-c120-44c4-9f07-1cef37ca07e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970857 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e380efd-ee4f-4339-b6ad-5688849a2c93-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.970867 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6427b69-b06b-4935-9206-ae09750f8900-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.971750 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e852f86-de0c-4691-b6dc-c0149b5932fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e852f86-de0c-4691-b6dc-c0149b5932fb" (UID: "7e852f86-de0c-4691-b6dc-c0149b5932fb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:40 crc kubenswrapper[5047]: I0223 07:06:40.974193 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e852f86-de0c-4691-b6dc-c0149b5932fb-kube-api-access-574bl" (OuterVolumeSpecName: "kube-api-access-574bl") pod "7e852f86-de0c-4691-b6dc-c0149b5932fb" (UID: "7e852f86-de0c-4691-b6dc-c0149b5932fb"). InnerVolumeSpecName "kube-api-access-574bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.073406 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e852f86-de0c-4691-b6dc-c0149b5932fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.073454 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-574bl\" (UniqueName: \"kubernetes.io/projected/7e852f86-de0c-4691-b6dc-c0149b5932fb-kube-api-access-574bl\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.831359 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8tflc" event={"ID":"199f6db1-c082-418e-9a55-330a39b15ed7","Type":"ContainerStarted","Data":"d69b5e2523509c2ac1197e8192f774f48b4003ecd9c6f511dc9d68dbb5ace140"} Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.831531 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ll9m9" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.831458 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j4sn8" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.831479 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hfcz7" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.831507 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-vwfrq" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.831464 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-lwfj5" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.831513 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-965e-account-create-update-hv27b" Feb 23 07:06:41 crc kubenswrapper[5047]: I0223 07:06:41.889552 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8tflc" podStartSLOduration=11.271411063 podStartE2EDuration="15.889523843s" podCreationTimestamp="2026-02-23 07:06:26 +0000 UTC" firstStartedPulling="2026-02-23 07:06:35.912790818 +0000 UTC m=+1318.164117952" lastFinishedPulling="2026-02-23 07:06:40.530903598 +0000 UTC m=+1322.782230732" observedRunningTime="2026-02-23 07:06:41.869713676 +0000 UTC m=+1324.121040830" watchObservedRunningTime="2026-02-23 07:06:41.889523843 +0000 UTC m=+1324.140850977" Feb 23 07:06:43 crc kubenswrapper[5047]: I0223 07:06:43.858405 5047 generic.go:334] "Generic (PLEG): container finished" podID="199f6db1-c082-418e-9a55-330a39b15ed7" containerID="d69b5e2523509c2ac1197e8192f774f48b4003ecd9c6f511dc9d68dbb5ace140" exitCode=0 Feb 23 07:06:43 crc kubenswrapper[5047]: I0223 07:06:43.858670 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8tflc" event={"ID":"199f6db1-c082-418e-9a55-330a39b15ed7","Type":"ContainerDied","Data":"d69b5e2523509c2ac1197e8192f774f48b4003ecd9c6f511dc9d68dbb5ace140"} Feb 23 07:06:43 crc kubenswrapper[5047]: I0223 07:06:43.866666 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" containerID="2d797d6a9c01a1cb9479944763429bf1039d9581a793335a6b8e1f9d395e3154" exitCode=0 Feb 23 07:06:43 crc kubenswrapper[5047]: I0223 07:06:43.866726 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9bx9t" event={"ID":"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2","Type":"ContainerDied","Data":"2d797d6a9c01a1cb9479944763429bf1039d9581a793335a6b8e1f9d395e3154"} Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.341806 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8tflc" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.398972 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-config-data\") pod \"199f6db1-c082-418e-9a55-330a39b15ed7\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.399096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjxvn\" (UniqueName: \"kubernetes.io/projected/199f6db1-c082-418e-9a55-330a39b15ed7-kube-api-access-zjxvn\") pod \"199f6db1-c082-418e-9a55-330a39b15ed7\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.399324 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-combined-ca-bundle\") pod \"199f6db1-c082-418e-9a55-330a39b15ed7\" (UID: \"199f6db1-c082-418e-9a55-330a39b15ed7\") " Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.408385 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199f6db1-c082-418e-9a55-330a39b15ed7-kube-api-access-zjxvn" (OuterVolumeSpecName: "kube-api-access-zjxvn") pod 
"199f6db1-c082-418e-9a55-330a39b15ed7" (UID: "199f6db1-c082-418e-9a55-330a39b15ed7"). InnerVolumeSpecName "kube-api-access-zjxvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.433787 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "199f6db1-c082-418e-9a55-330a39b15ed7" (UID: "199f6db1-c082-418e-9a55-330a39b15ed7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.461023 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-config-data" (OuterVolumeSpecName: "config-data") pod "199f6db1-c082-418e-9a55-330a39b15ed7" (UID: "199f6db1-c082-418e-9a55-330a39b15ed7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.483354 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.500778 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-db-sync-config-data\") pod \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.501706 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-config-data\") pod \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.501756 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzjbp\" (UniqueName: \"kubernetes.io/projected/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-kube-api-access-tzjbp\") pod \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.501988 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-combined-ca-bundle\") pod \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\" (UID: \"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2\") " Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.506600 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-kube-api-access-tzjbp" (OuterVolumeSpecName: "kube-api-access-tzjbp") pod "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" (UID: "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2"). InnerVolumeSpecName "kube-api-access-tzjbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.507045 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" (UID: "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.507223 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.507281 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199f6db1-c082-418e-9a55-330a39b15ed7-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.507297 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjxvn\" (UniqueName: \"kubernetes.io/projected/199f6db1-c082-418e-9a55-330a39b15ed7-kube-api-access-zjxvn\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.529146 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" (UID: "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.555231 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-config-data" (OuterVolumeSpecName: "config-data") pod "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" (UID: "2e1cee2a-62ea-4010-8d71-d9ddd321a9c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.609234 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.609278 5047 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.609291 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.609300 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzjbp\" (UniqueName: \"kubernetes.io/projected/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2-kube-api-access-tzjbp\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.891002 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9bx9t" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.890881 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9bx9t" event={"ID":"2e1cee2a-62ea-4010-8d71-d9ddd321a9c2","Type":"ContainerDied","Data":"cfefeb266a17cb27f73648472a05b5ff89ca972be675aa29e268d2f514055aba"} Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.891187 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfefeb266a17cb27f73648472a05b5ff89ca972be675aa29e268d2f514055aba" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.893379 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8tflc" event={"ID":"199f6db1-c082-418e-9a55-330a39b15ed7","Type":"ContainerDied","Data":"a09bcd4a7a34f9e1bc185d18ba6e55e731bfa5eff11b6faa01c9dc93f3a9252a"} Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.893507 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09bcd4a7a34f9e1bc185d18ba6e55e731bfa5eff11b6faa01c9dc93f3a9252a" Feb 23 07:06:45 crc kubenswrapper[5047]: I0223 07:06:45.893449 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8tflc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507348 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-q6m87"] Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507730 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146d74a4-9a27-47d9-87e7-3372501acab0" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507745 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="146d74a4-9a27-47d9-87e7-3372501acab0" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507759 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e380efd-ee4f-4339-b6ad-5688849a2c93" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507766 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e380efd-ee4f-4339-b6ad-5688849a2c93" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507780 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6427b69-b06b-4935-9206-ae09750f8900" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507788 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6427b69-b06b-4935-9206-ae09750f8900" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507805 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" containerName="glance-db-sync" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507812 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" containerName="glance-db-sync" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507827 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63179dda-c120-44c4-9f07-1cef37ca07e0" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507834 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="63179dda-c120-44c4-9f07-1cef37ca07e0" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507847 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199f6db1-c082-418e-9a55-330a39b15ed7" containerName="keystone-db-sync" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507853 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="199f6db1-c082-418e-9a55-330a39b15ed7" containerName="keystone-db-sync" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507866 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efebf35-1d9c-450d-abb5-6e247edc9db2" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507874 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efebf35-1d9c-450d-abb5-6e247edc9db2" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507884 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e852f86-de0c-4691-b6dc-c0149b5932fb" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507889 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e852f86-de0c-4691-b6dc-c0149b5932fb" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507898 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerName="init" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507920 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerName="init" Feb 23 07:06:46 crc kubenswrapper[5047]: E0223 07:06:46.507927 5047 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerName="dnsmasq-dns" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.507934 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerName="dnsmasq-dns" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508069 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e852f86-de0c-4691-b6dc-c0149b5932fb" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508077 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="199f6db1-c082-418e-9a55-330a39b15ed7" containerName="keystone-db-sync" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508091 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efebf35-1d9c-450d-abb5-6e247edc9db2" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508103 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6427b69-b06b-4935-9206-ae09750f8900" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508111 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="146d74a4-9a27-47d9-87e7-3372501acab0" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508118 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e380efd-ee4f-4339-b6ad-5688849a2c93" containerName="mariadb-database-create" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508126 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c765369-da46-4bff-96f0-279d6c0b3f2c" containerName="dnsmasq-dns" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.508139 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" containerName="glance-db-sync" Feb 23 07:06:46 crc 
kubenswrapper[5047]: I0223 07:06:46.508147 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="63179dda-c120-44c4-9f07-1cef37ca07e0" containerName="mariadb-account-create-update" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.509103 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.537457 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.537524 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.537577 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.537603 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gwd\" (UniqueName: \"kubernetes.io/projected/af1521c5-df95-4084-8c0e-301abbc8e28e-kube-api-access-m2gwd\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " 
pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.538559 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.538798 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-config\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.548675 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-q6m87"] Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.618135 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x7tmc"] Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.623427 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.628877 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.629059 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.629100 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-whfdb" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.629238 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.629363 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.641039 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-config\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.641143 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.641191 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " 
pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.641255 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.641296 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gwd\" (UniqueName: \"kubernetes.io/projected/af1521c5-df95-4084-8c0e-301abbc8e28e-kube-api-access-m2gwd\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.641339 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.642829 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.642899 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-config\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 
crc kubenswrapper[5047]: I0223 07:06:46.643567 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.646367 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.648117 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.660673 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7tmc"] Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.680873 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gwd\" (UniqueName: \"kubernetes.io/projected/af1521c5-df95-4084-8c0e-301abbc8e28e-kube-api-access-m2gwd\") pod \"dnsmasq-dns-68677f88c9-q6m87\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.690635 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-q6m87"] Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.691705 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.738942 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-dqlkx"] Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.740851 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.742436 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlncq\" (UniqueName: \"kubernetes.io/projected/73c6907d-dad6-4fc7-942d-5cb624bc524e-kube-api-access-tlncq\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.742502 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-credential-keys\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.742540 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-combined-ca-bundle\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.742585 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-scripts\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " 
pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.742612 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-fernet-keys\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.742644 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-config-data\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.753806 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-dqlkx"] Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.762766 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.762849 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.762895 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 
07:06:46.763858 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18683d8e116e648be702e8fe2b59331eb73a682d795d009e915974255f56b210"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:06:46 crc kubenswrapper[5047]: I0223 07:06:46.763933 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://18683d8e116e648be702e8fe2b59331eb73a682d795d009e915974255f56b210" gracePeriod=600 Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846182 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-scripts\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846243 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846266 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-fernet-keys\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846297 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846322 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-config-data\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846342 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846359 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-config\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846385 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps7d4\" (UniqueName: \"kubernetes.io/projected/6425be06-addb-45b9-b3ce-9e414555e5b2-kube-api-access-ps7d4\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846406 
5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlncq\" (UniqueName: \"kubernetes.io/projected/73c6907d-dad6-4fc7-942d-5cb624bc524e-kube-api-access-tlncq\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846441 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-credential-keys\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846508 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-combined-ca-bundle\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.846530 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.854617 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-scripts\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.855702 5047 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-config-data\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.900451 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-credential-keys\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.900533 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-fernet-keys\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.909152 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-combined-ca-bundle\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.940101 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlncq\" (UniqueName: \"kubernetes.io/projected/73c6907d-dad6-4fc7-942d-5cb624bc524e-kube-api-access-tlncq\") pod \"keystone-bootstrap-x7tmc\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.997791 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps7d4\" (UniqueName: 
\"kubernetes.io/projected/6425be06-addb-45b9-b3ce-9e414555e5b2-kube-api-access-ps7d4\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:46.998813 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.002741 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-tqpzs"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.008093 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.008406 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.008974 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.009053 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.009395 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.009551 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.009093 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-config\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.010048 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.012689 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.013222 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-48c7h" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.015575 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-config\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.015885 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.016107 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.027860 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.039628 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tqpzs"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.048557 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps7d4\" (UniqueName: \"kubernetes.io/projected/6425be06-addb-45b9-b3ce-9e414555e5b2-kube-api-access-ps7d4\") pod \"dnsmasq-dns-7d67cdfc8f-dqlkx\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.082741 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-dqlkx"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.083624 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.099451 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dccc895-qmfq9"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.101166 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.118402 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-combined-ca-bundle\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.118498 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54bac12-7429-4d94-a023-535d11e803d4-logs\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.118517 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx55v\" (UniqueName: \"kubernetes.io/projected/b54bac12-7429-4d94-a023-535d11e803d4-kube-api-access-wx55v\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.118539 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-config-data\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 
23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.118593 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-scripts\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.123935 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-qmfq9"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.139020 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.148932 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-vw9rj"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.149763 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.149888 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.152043 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vw9rj"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.154300 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.154988 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-v97qw" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.157123 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.157575 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.163046 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nglmq"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.171634 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.177136 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4k8zt" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.177518 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.183180 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.188796 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.201407 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nglmq"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.219718 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.219766 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-log-httpd\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220049 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj2m5\" (UniqueName: \"kubernetes.io/projected/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-kube-api-access-rj2m5\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: 
\"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220083 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-combined-ca-bundle\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220132 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-db-sync-config-data\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220149 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg67s\" (UniqueName: \"kubernetes.io/projected/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-kube-api-access-gg67s\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220188 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220225 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220251 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-combined-ca-bundle\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220270 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220473 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-config\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220851 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-scripts\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220961 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54bac12-7429-4d94-a023-535d11e803d4-logs\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 
23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.220996 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx55v\" (UniqueName: \"kubernetes.io/projected/b54bac12-7429-4d94-a023-535d11e803d4-kube-api-access-wx55v\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221032 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-config-data\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221052 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-svc\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221124 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8g9j\" (UniqueName: \"kubernetes.io/projected/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-kube-api-access-r8g9j\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221722 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221795 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-config-data\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221837 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-run-httpd\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221876 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-scripts\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.221896 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54bac12-7429-4d94-a023-535d11e803d4-logs\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.234035 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8zsv6"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.235681 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.237972 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.239081 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pgvbz" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.239858 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.245014 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx55v\" (UniqueName: \"kubernetes.io/projected/b54bac12-7429-4d94-a023-535d11e803d4-kube-api-access-wx55v\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.245308 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-scripts\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.246612 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-combined-ca-bundle\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.249424 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-config-data\") pod \"placement-db-sync-tqpzs\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " 
pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.254928 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8zsv6"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.324621 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8g9j\" (UniqueName: \"kubernetes.io/projected/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-kube-api-access-r8g9j\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.325100 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.325130 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-config-data\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326159 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-db-sync-config-data\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326187 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-scripts\") pod \"cinder-db-sync-nglmq\" (UID: 
\"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326206 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-run-httpd\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326231 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-config-data\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326257 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326281 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-log-httpd\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326302 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj2m5\" (UniqueName: \"kubernetes.io/projected/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-kube-api-access-rj2m5\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: 
I0223 07:06:47.326340 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-combined-ca-bundle\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326361 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-db-sync-config-data\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326383 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aca947e1-09b7-404f-9162-94484bf52701-etc-machine-id\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326401 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfk9\" (UniqueName: \"kubernetes.io/projected/67fadcfb-082c-4b01-9972-816963267dff-kube-api-access-4zfk9\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326421 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg67s\" (UniqueName: \"kubernetes.io/projected/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-kube-api-access-gg67s\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326444 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326479 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326500 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-combined-ca-bundle\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326520 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326558 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lld\" (UniqueName: \"kubernetes.io/projected/aca947e1-09b7-404f-9162-94484bf52701-kube-api-access-99lld\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326580 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-config\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326597 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-combined-ca-bundle\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326635 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-scripts\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326654 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-svc\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.326671 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-config\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.331013 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tqpzs" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.331567 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.331976 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-run-httpd\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.332582 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-log-httpd\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.333200 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.334776 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.336312 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-config\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.337324 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-svc\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.345680 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.346897 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.347105 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-combined-ca-bundle\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.347238 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-db-sync-config-data\") pod \"barbican-db-sync-vw9rj\" 
(UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.347475 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-scripts\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.351713 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8g9j\" (UniqueName: \"kubernetes.io/projected/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-kube-api-access-r8g9j\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.352743 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-config-data\") pod \"ceilometer-0\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.360555 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg67s\" (UniqueName: \"kubernetes.io/projected/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-kube-api-access-gg67s\") pod \"barbican-db-sync-vw9rj\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.361621 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj2m5\" (UniqueName: \"kubernetes.io/projected/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-kube-api-access-rj2m5\") pod \"dnsmasq-dns-67dccc895-qmfq9\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.428860 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-99lld\" (UniqueName: \"kubernetes.io/projected/aca947e1-09b7-404f-9162-94484bf52701-kube-api-access-99lld\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.428926 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-combined-ca-bundle\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.428975 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-config\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.429024 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-db-sync-config-data\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.429045 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-scripts\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.429067 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-config-data\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.432120 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfk9\" (UniqueName: \"kubernetes.io/projected/67fadcfb-082c-4b01-9972-816963267dff-kube-api-access-4zfk9\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.432172 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aca947e1-09b7-404f-9162-94484bf52701-etc-machine-id\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.432308 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-combined-ca-bundle\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.434658 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-db-sync-config-data\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.434784 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aca947e1-09b7-404f-9162-94484bf52701-etc-machine-id\") pod \"cinder-db-sync-nglmq\" (UID: 
\"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.435734 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-config\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.439555 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.439866 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-scripts\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.442888 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-combined-ca-bundle\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.447760 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.450913 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-config-data\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.460520 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfk9\" (UniqueName: \"kubernetes.io/projected/67fadcfb-082c-4b01-9972-816963267dff-kube-api-access-4zfk9\") pod \"neutron-db-sync-8zsv6\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.466335 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-combined-ca-bundle\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.491855 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.505032 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lld\" (UniqueName: \"kubernetes.io/projected/aca947e1-09b7-404f-9162-94484bf52701-kube-api-access-99lld\") pod \"cinder-db-sync-nglmq\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.509853 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nglmq" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.515423 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.804602 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.815826 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.819524 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.823729 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.823971 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.824110 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rhm9s" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.938274 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.940575 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.946636 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.951393 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.951517 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.951559 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-logs\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.951604 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.951644 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.951662 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.951698 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z5k\" (UniqueName: \"kubernetes.io/projected/f8b73cfe-a7a9-45be-8773-0edac983a26f-kube-api-access-q6z5k\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:47.968347 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.006820 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="18683d8e116e648be702e8fe2b59331eb73a682d795d009e915974255f56b210" exitCode=0 Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.006885 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"18683d8e116e648be702e8fe2b59331eb73a682d795d009e915974255f56b210"} Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.006946 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"837f7c0cf58cb1629ad928ad357807b43700d666b61b873505ed015b092129de"} Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.006972 5047 scope.go:117] "RemoveContainer" containerID="3620b5bb5ccf16efd5aeafb9b69b3b4d8f6c25db9ca9c3c473e293ae127f043c" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.056769 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.056836 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.056883 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057476 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057529 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057575 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6z5k\" (UniqueName: \"kubernetes.io/projected/f8b73cfe-a7a9-45be-8773-0edac983a26f-kube-api-access-q6z5k\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057611 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057645 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057682 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057787 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057814 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057850 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057882 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6cst\" (UniqueName: \"kubernetes.io/projected/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-kube-api-access-l6cst\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.057930 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-logs\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.058477 5047 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.058962 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.059293 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-logs\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.064977 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.067884 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.072260 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.082431 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6z5k\" (UniqueName: \"kubernetes.io/projected/f8b73cfe-a7a9-45be-8773-0edac983a26f-kube-api-access-q6z5k\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.118343 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.159795 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.161718 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.161790 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.161873 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.161926 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.161963 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6cst\" (UniqueName: \"kubernetes.io/projected/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-kube-api-access-l6cst\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " 
pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.162066 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.162088 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.162294 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.163982 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.164308 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.185863 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.187684 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6cst\" (UniqueName: \"kubernetes.io/projected/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-kube-api-access-l6cst\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.188151 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.208485 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.220006 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.295494 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.646965 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8zsv6"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.662169 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.696438 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7tmc"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.714705 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-qmfq9"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.723251 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nglmq"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.739003 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tqpzs"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.759884 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-dqlkx"] Feb 23 07:06:48 crc kubenswrapper[5047]: W0223 07:06:48.766557 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc24a7665_aaa1_4ec3_bb37_2ec812f8386e.slice/crio-d7cbc3b229cfed82e13ad2851084eee2ceab1f395c428ac3c685e098593fd027 WatchSource:0}: Error finding container d7cbc3b229cfed82e13ad2851084eee2ceab1f395c428ac3c685e098593fd027: Status 404 returned error can't find the container with id d7cbc3b229cfed82e13ad2851084eee2ceab1f395c428ac3c685e098593fd027 Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.789211 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vw9rj"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.798476 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-68677f88c9-q6m87"] Feb 23 07:06:48 crc kubenswrapper[5047]: I0223 07:06:48.994111 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.055619 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zsv6" event={"ID":"67fadcfb-082c-4b01-9972-816963267dff","Type":"ContainerStarted","Data":"9fb7f9b26033ae6da38802f2f97a664948895be0191c59508f4e4bb012c2cb7d"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.070546 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-q6m87" event={"ID":"af1521c5-df95-4084-8c0e-301abbc8e28e","Type":"ContainerStarted","Data":"a83477e425c548fb8aebb729d44031b0b4fe0971b176a893671f4b2e34d04fd0"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.077250 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nglmq" event={"ID":"aca947e1-09b7-404f-9162-94484bf52701","Type":"ContainerStarted","Data":"329a6c5c160bc4ade2a05131c1a0ed9ecae2ee7191b5cb346289e3d67c0c3d49"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.082171 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7tmc" event={"ID":"73c6907d-dad6-4fc7-942d-5cb624bc524e","Type":"ContainerStarted","Data":"c2853e5874506cf3a1e2740514e6c5e7bcec099b1d9186df03fcc546bfae3073"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.100903 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vw9rj" event={"ID":"4effbb70-cdb3-42e4-a8d3-e1df904f38b9","Type":"ContainerStarted","Data":"73a5c6ee7f4f6bf5f16a7d851a6d70d882c52914a97d036202771f4d8ade43e5"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.104897 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" 
event={"ID":"6425be06-addb-45b9-b3ce-9e414555e5b2","Type":"ContainerStarted","Data":"a228a69cc34544a02ddf35071938bdf9ce99a3f501293df3c81b9d3da2951dc0"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.111278 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" event={"ID":"c24a7665-aaa1-4ec3-bb37-2ec812f8386e","Type":"ContainerStarted","Data":"d7cbc3b229cfed82e13ad2851084eee2ceab1f395c428ac3c685e098593fd027"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.113105 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tqpzs" event={"ID":"b54bac12-7429-4d94-a023-535d11e803d4","Type":"ContainerStarted","Data":"8decca2cc6f412751399db0ca2aa9a750030d9748945dab9138e51ae141a4b33"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.115253 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerStarted","Data":"2189e74f1c16851895c6382ed06b5927e6abb93a6242832359b383a9615013c8"} Feb 23 07:06:49 crc kubenswrapper[5047]: I0223 07:06:49.696263 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:49 crc kubenswrapper[5047]: W0223 07:06:49.753848 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ed54528_ea6c_4b86_b10c_d3b0bd7cf822.slice/crio-c2d6734d54be1a1d077f3ee89e8d92736ab5afe77eec092c632e131cdec242e6 WatchSource:0}: Error finding container c2d6734d54be1a1d077f3ee89e8d92736ab5afe77eec092c632e131cdec242e6: Status 404 returned error can't find the container with id c2d6734d54be1a1d077f3ee89e8d92736ab5afe77eec092c632e131cdec242e6 Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.122295 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 
07:06:50.143974 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.165752 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822","Type":"ContainerStarted","Data":"c2d6734d54be1a1d077f3ee89e8d92736ab5afe77eec092c632e131cdec242e6"} Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.191487 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7tmc" event={"ID":"73c6907d-dad6-4fc7-942d-5cb624bc524e","Type":"ContainerStarted","Data":"2838da4fc4581c30df7297386992da7a7d97e1c93377f3b9bedfba25d4cf24ed"} Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.224556 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x7tmc" podStartSLOduration=4.224531631 podStartE2EDuration="4.224531631s" podCreationTimestamp="2026-02-23 07:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:50.222685612 +0000 UTC m=+1332.474012746" watchObservedRunningTime="2026-02-23 07:06:50.224531631 +0000 UTC m=+1332.475858765" Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.263739 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.275011 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" event={"ID":"6425be06-addb-45b9-b3ce-9e414555e5b2","Type":"ContainerDied","Data":"a5652e0d2bbebbfe18cfb27b333e84b6ca9fdeed33302170e043313d4ed13609"} Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.275241 5047 generic.go:334] "Generic (PLEG): container finished" podID="6425be06-addb-45b9-b3ce-9e414555e5b2" 
containerID="a5652e0d2bbebbfe18cfb27b333e84b6ca9fdeed33302170e043313d4ed13609" exitCode=0 Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.277700 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8b73cfe-a7a9-45be-8773-0edac983a26f","Type":"ContainerStarted","Data":"02f91a5d613f258d87e6e54abf9b3a18bc49cba7b61e02a6994d57f450dd9dc4"} Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.281571 5047 generic.go:334] "Generic (PLEG): container finished" podID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerID="737497cf2ac9b016880b5d7b4a2b3659fb2c6bb26357ef5ff1733b4e482559fc" exitCode=0 Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.281660 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" event={"ID":"c24a7665-aaa1-4ec3-bb37-2ec812f8386e","Type":"ContainerDied","Data":"737497cf2ac9b016880b5d7b4a2b3659fb2c6bb26357ef5ff1733b4e482559fc"} Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.299083 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zsv6" event={"ID":"67fadcfb-082c-4b01-9972-816963267dff","Type":"ContainerStarted","Data":"ce63d2873c33fcb8fe6a6f80101295092c9629334e7ca2d463fb1fa48336a19d"} Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.320707 5047 generic.go:334] "Generic (PLEG): container finished" podID="af1521c5-df95-4084-8c0e-301abbc8e28e" containerID="c2b271370c63cb68536ad914dbc533711997f886b77b0540af2a961313d972ed" exitCode=0 Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.320796 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-q6m87" event={"ID":"af1521c5-df95-4084-8c0e-301abbc8e28e","Type":"ContainerDied","Data":"c2b271370c63cb68536ad914dbc533711997f886b77b0540af2a961313d972ed"} Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.442733 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-db-sync-8zsv6" podStartSLOduration=3.442692531 podStartE2EDuration="3.442692531s" podCreationTimestamp="2026-02-23 07:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:50.403176091 +0000 UTC m=+1332.654503225" watchObservedRunningTime="2026-02-23 07:06:50.442692531 +0000 UTC m=+1332.694019665" Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.850341 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.884455 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.978208 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-nb\") pod \"6425be06-addb-45b9-b3ce-9e414555e5b2\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.978373 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps7d4\" (UniqueName: \"kubernetes.io/projected/6425be06-addb-45b9-b3ce-9e414555e5b2-kube-api-access-ps7d4\") pod \"6425be06-addb-45b9-b3ce-9e414555e5b2\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.978440 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-sb\") pod \"6425be06-addb-45b9-b3ce-9e414555e5b2\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.978471 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-config\") pod \"6425be06-addb-45b9-b3ce-9e414555e5b2\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.978554 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-swift-storage-0\") pod \"6425be06-addb-45b9-b3ce-9e414555e5b2\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.978588 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-svc\") pod \"6425be06-addb-45b9-b3ce-9e414555e5b2\" (UID: \"6425be06-addb-45b9-b3ce-9e414555e5b2\") " Feb 23 07:06:50 crc kubenswrapper[5047]: I0223 07:06:50.999768 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6425be06-addb-45b9-b3ce-9e414555e5b2-kube-api-access-ps7d4" (OuterVolumeSpecName: "kube-api-access-ps7d4") pod "6425be06-addb-45b9-b3ce-9e414555e5b2" (UID: "6425be06-addb-45b9-b3ce-9e414555e5b2"). InnerVolumeSpecName "kube-api-access-ps7d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.009756 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6425be06-addb-45b9-b3ce-9e414555e5b2" (UID: "6425be06-addb-45b9-b3ce-9e414555e5b2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.013721 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6425be06-addb-45b9-b3ce-9e414555e5b2" (UID: "6425be06-addb-45b9-b3ce-9e414555e5b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.018723 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6425be06-addb-45b9-b3ce-9e414555e5b2" (UID: "6425be06-addb-45b9-b3ce-9e414555e5b2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.021520 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6425be06-addb-45b9-b3ce-9e414555e5b2" (UID: "6425be06-addb-45b9-b3ce-9e414555e5b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.036676 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-config" (OuterVolumeSpecName: "config") pod "6425be06-addb-45b9-b3ce-9e414555e5b2" (UID: "6425be06-addb-45b9-b3ce-9e414555e5b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.080283 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-svc\") pod \"af1521c5-df95-4084-8c0e-301abbc8e28e\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.080511 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2gwd\" (UniqueName: \"kubernetes.io/projected/af1521c5-df95-4084-8c0e-301abbc8e28e-kube-api-access-m2gwd\") pod \"af1521c5-df95-4084-8c0e-301abbc8e28e\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.080539 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-nb\") pod \"af1521c5-df95-4084-8c0e-301abbc8e28e\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.080617 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-swift-storage-0\") pod \"af1521c5-df95-4084-8c0e-301abbc8e28e\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.080653 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-config\") pod \"af1521c5-df95-4084-8c0e-301abbc8e28e\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.081021 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-sb\") pod \"af1521c5-df95-4084-8c0e-301abbc8e28e\" (UID: \"af1521c5-df95-4084-8c0e-301abbc8e28e\") " Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.081415 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.081434 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.081447 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps7d4\" (UniqueName: \"kubernetes.io/projected/6425be06-addb-45b9-b3ce-9e414555e5b2-kube-api-access-ps7d4\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.081457 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.081467 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.081476 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6425be06-addb-45b9-b3ce-9e414555e5b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.085343 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/af1521c5-df95-4084-8c0e-301abbc8e28e-kube-api-access-m2gwd" (OuterVolumeSpecName: "kube-api-access-m2gwd") pod "af1521c5-df95-4084-8c0e-301abbc8e28e" (UID: "af1521c5-df95-4084-8c0e-301abbc8e28e"). InnerVolumeSpecName "kube-api-access-m2gwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.111576 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af1521c5-df95-4084-8c0e-301abbc8e28e" (UID: "af1521c5-df95-4084-8c0e-301abbc8e28e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.117098 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-config" (OuterVolumeSpecName: "config") pod "af1521c5-df95-4084-8c0e-301abbc8e28e" (UID: "af1521c5-df95-4084-8c0e-301abbc8e28e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.121646 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af1521c5-df95-4084-8c0e-301abbc8e28e" (UID: "af1521c5-df95-4084-8c0e-301abbc8e28e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.122341 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af1521c5-df95-4084-8c0e-301abbc8e28e" (UID: "af1521c5-df95-4084-8c0e-301abbc8e28e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.127470 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af1521c5-df95-4084-8c0e-301abbc8e28e" (UID: "af1521c5-df95-4084-8c0e-301abbc8e28e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.183618 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.183658 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2gwd\" (UniqueName: \"kubernetes.io/projected/af1521c5-df95-4084-8c0e-301abbc8e28e-kube-api-access-m2gwd\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.183669 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.183680 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.183694 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.183731 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/af1521c5-df95-4084-8c0e-301abbc8e28e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.385839 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-q6m87" event={"ID":"af1521c5-df95-4084-8c0e-301abbc8e28e","Type":"ContainerDied","Data":"a83477e425c548fb8aebb729d44031b0b4fe0971b176a893671f4b2e34d04fd0"} Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.385919 5047 scope.go:117] "RemoveContainer" containerID="c2b271370c63cb68536ad914dbc533711997f886b77b0540af2a961313d972ed" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.386291 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-q6m87" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.397472 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" event={"ID":"6425be06-addb-45b9-b3ce-9e414555e5b2","Type":"ContainerDied","Data":"a228a69cc34544a02ddf35071938bdf9ce99a3f501293df3c81b9d3da2951dc0"} Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.397887 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-dqlkx" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.453768 5047 scope.go:117] "RemoveContainer" containerID="a5652e0d2bbebbfe18cfb27b333e84b6ca9fdeed33302170e043313d4ed13609" Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.546092 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-dqlkx"] Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.555501 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-dqlkx"] Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.576108 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-q6m87"] Feb 23 07:06:51 crc kubenswrapper[5047]: I0223 07:06:51.589120 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-q6m87"] Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.355430 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6425be06-addb-45b9-b3ce-9e414555e5b2" path="/var/lib/kubelet/pods/6425be06-addb-45b9-b3ce-9e414555e5b2/volumes" Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.356545 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1521c5-df95-4084-8c0e-301abbc8e28e" path="/var/lib/kubelet/pods/af1521c5-df95-4084-8c0e-301abbc8e28e/volumes" Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.412967 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" event={"ID":"c24a7665-aaa1-4ec3-bb37-2ec812f8386e","Type":"ContainerStarted","Data":"768a91714a4c77f293aad71986199d4ec8a4b28e9c11e97ea29bb3aa818453d8"} Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.413084 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.431427 5047 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822","Type":"ContainerStarted","Data":"3f877f648acaeb166f67ee91ed49a3582ac0948aa793dbe2db3f2d7b6eb0a300"} Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.431506 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822","Type":"ContainerStarted","Data":"f3661aba84786900e3928939ee027d6b8095617b730b6e7d9bbe1d1719aca6a7"} Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.431733 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-log" containerID="cri-o://f3661aba84786900e3928939ee027d6b8095617b730b6e7d9bbe1d1719aca6a7" gracePeriod=30 Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.432336 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-httpd" containerID="cri-o://3f877f648acaeb166f67ee91ed49a3582ac0948aa793dbe2db3f2d7b6eb0a300" gracePeriod=30 Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.436432 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" podStartSLOduration=6.436415363 podStartE2EDuration="6.436415363s" podCreationTimestamp="2026-02-23 07:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:52.435051337 +0000 UTC m=+1334.686378481" watchObservedRunningTime="2026-02-23 07:06:52.436415363 +0000 UTC m=+1334.687742497" Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.457874 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f8b73cfe-a7a9-45be-8773-0edac983a26f","Type":"ContainerStarted","Data":"5c13f390fd300e4d74624f2ea06c5dab39ec8c070cd1cdae9652d52e8797efb2"} Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.458153 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-log" containerID="cri-o://5c13f390fd300e4d74624f2ea06c5dab39ec8c070cd1cdae9652d52e8797efb2" gracePeriod=30 Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.458297 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-httpd" containerID="cri-o://79eb79b6849212579adcd96a0c9b0534b3df6b745c908bf3ed0f5d2ab6f834c1" gracePeriod=30 Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.475083 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.475063171 podStartE2EDuration="6.475063171s" podCreationTimestamp="2026-02-23 07:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:52.471117915 +0000 UTC m=+1334.722445049" watchObservedRunningTime="2026-02-23 07:06:52.475063171 +0000 UTC m=+1334.726390305" Feb 23 07:06:52 crc kubenswrapper[5047]: I0223 07:06:52.499983 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.499957063 podStartE2EDuration="6.499957063s" podCreationTimestamp="2026-02-23 07:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:52.491831596 +0000 UTC m=+1334.743158730" watchObservedRunningTime="2026-02-23 07:06:52.499957063 +0000 UTC 
m=+1334.751284197" Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.473978 5047 generic.go:334] "Generic (PLEG): container finished" podID="73c6907d-dad6-4fc7-942d-5cb624bc524e" containerID="2838da4fc4581c30df7297386992da7a7d97e1c93377f3b9bedfba25d4cf24ed" exitCode=0 Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.474077 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7tmc" event={"ID":"73c6907d-dad6-4fc7-942d-5cb624bc524e","Type":"ContainerDied","Data":"2838da4fc4581c30df7297386992da7a7d97e1c93377f3b9bedfba25d4cf24ed"} Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.478816 5047 generic.go:334] "Generic (PLEG): container finished" podID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerID="79eb79b6849212579adcd96a0c9b0534b3df6b745c908bf3ed0f5d2ab6f834c1" exitCode=143 Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.478851 5047 generic.go:334] "Generic (PLEG): container finished" podID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerID="5c13f390fd300e4d74624f2ea06c5dab39ec8c070cd1cdae9652d52e8797efb2" exitCode=143 Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.478927 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8b73cfe-a7a9-45be-8773-0edac983a26f","Type":"ContainerDied","Data":"79eb79b6849212579adcd96a0c9b0534b3df6b745c908bf3ed0f5d2ab6f834c1"} Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.479017 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8b73cfe-a7a9-45be-8773-0edac983a26f","Type":"ContainerDied","Data":"5c13f390fd300e4d74624f2ea06c5dab39ec8c070cd1cdae9652d52e8797efb2"} Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.482568 5047 generic.go:334] "Generic (PLEG): container finished" podID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerID="3f877f648acaeb166f67ee91ed49a3582ac0948aa793dbe2db3f2d7b6eb0a300" exitCode=143 Feb 23 07:06:53 
crc kubenswrapper[5047]: I0223 07:06:53.482601 5047 generic.go:334] "Generic (PLEG): container finished" podID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerID="f3661aba84786900e3928939ee027d6b8095617b730b6e7d9bbe1d1719aca6a7" exitCode=143 Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.482598 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822","Type":"ContainerDied","Data":"3f877f648acaeb166f67ee91ed49a3582ac0948aa793dbe2db3f2d7b6eb0a300"} Feb 23 07:06:53 crc kubenswrapper[5047]: I0223 07:06:53.482664 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822","Type":"ContainerDied","Data":"f3661aba84786900e3928939ee027d6b8095617b730b6e7d9bbe1d1719aca6a7"} Feb 23 07:06:57 crc kubenswrapper[5047]: I0223 07:06:57.450193 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:06:57 crc kubenswrapper[5047]: I0223 07:06:57.542947 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-fxhh5"] Feb 23 07:06:57 crc kubenswrapper[5047]: I0223 07:06:57.543245 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="dnsmasq-dns" containerID="cri-o://af59b9535f389d7d7a06a601dbefe85600115748ae0d09b9bba1ac8ff86c9b73" gracePeriod=10 Feb 23 07:06:58 crc kubenswrapper[5047]: I0223 07:06:58.541123 5047 generic.go:334] "Generic (PLEG): container finished" podID="92786ecb-6090-40dc-8942-001a54d35d12" containerID="af59b9535f389d7d7a06a601dbefe85600115748ae0d09b9bba1ac8ff86c9b73" exitCode=0 Feb 23 07:06:58 crc kubenswrapper[5047]: I0223 07:06:58.541837 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" 
event={"ID":"92786ecb-6090-40dc-8942-001a54d35d12","Type":"ContainerDied","Data":"af59b9535f389d7d7a06a601dbefe85600115748ae0d09b9bba1ac8ff86c9b73"} Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.374861 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.479039 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-scripts\") pod \"73c6907d-dad6-4fc7-942d-5cb624bc524e\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.479146 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-credential-keys\") pod \"73c6907d-dad6-4fc7-942d-5cb624bc524e\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.480352 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-config-data\") pod \"73c6907d-dad6-4fc7-942d-5cb624bc524e\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.480476 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlncq\" (UniqueName: \"kubernetes.io/projected/73c6907d-dad6-4fc7-942d-5cb624bc524e-kube-api-access-tlncq\") pod \"73c6907d-dad6-4fc7-942d-5cb624bc524e\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.480576 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-fernet-keys\") pod 
\"73c6907d-dad6-4fc7-942d-5cb624bc524e\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.480679 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-combined-ca-bundle\") pod \"73c6907d-dad6-4fc7-942d-5cb624bc524e\" (UID: \"73c6907d-dad6-4fc7-942d-5cb624bc524e\") " Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.486864 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c6907d-dad6-4fc7-942d-5cb624bc524e-kube-api-access-tlncq" (OuterVolumeSpecName: "kube-api-access-tlncq") pod "73c6907d-dad6-4fc7-942d-5cb624bc524e" (UID: "73c6907d-dad6-4fc7-942d-5cb624bc524e"). InnerVolumeSpecName "kube-api-access-tlncq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.488390 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "73c6907d-dad6-4fc7-942d-5cb624bc524e" (UID: "73c6907d-dad6-4fc7-942d-5cb624bc524e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.488479 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-scripts" (OuterVolumeSpecName: "scripts") pod "73c6907d-dad6-4fc7-942d-5cb624bc524e" (UID: "73c6907d-dad6-4fc7-942d-5cb624bc524e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.502312 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "73c6907d-dad6-4fc7-942d-5cb624bc524e" (UID: "73c6907d-dad6-4fc7-942d-5cb624bc524e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.512439 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c6907d-dad6-4fc7-942d-5cb624bc524e" (UID: "73c6907d-dad6-4fc7-942d-5cb624bc524e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.519441 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-config-data" (OuterVolumeSpecName: "config-data") pod "73c6907d-dad6-4fc7-942d-5cb624bc524e" (UID: "73c6907d-dad6-4fc7-942d-5cb624bc524e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.564673 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7tmc" event={"ID":"73c6907d-dad6-4fc7-942d-5cb624bc524e","Type":"ContainerDied","Data":"c2853e5874506cf3a1e2740514e6c5e7bcec099b1d9186df03fcc546bfae3073"} Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.564731 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2853e5874506cf3a1e2740514e6c5e7bcec099b1d9186df03fcc546bfae3073" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.564782 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7tmc" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.583128 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlncq\" (UniqueName: \"kubernetes.io/projected/73c6907d-dad6-4fc7-942d-5cb624bc524e-kube-api-access-tlncq\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.583169 5047 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.583185 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.583198 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.583210 5047 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:00 crc kubenswrapper[5047]: I0223 07:07:00.583221 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c6907d-dad6-4fc7-942d-5cb624bc524e-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.472386 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x7tmc"] Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.490153 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x7tmc"] Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.583344 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lc9cb"] Feb 23 07:07:01 crc kubenswrapper[5047]: E0223 07:07:01.587024 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c6907d-dad6-4fc7-942d-5cb624bc524e" containerName="keystone-bootstrap" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.587066 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c6907d-dad6-4fc7-942d-5cb624bc524e" containerName="keystone-bootstrap" Feb 23 07:07:01 crc kubenswrapper[5047]: E0223 07:07:01.587096 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6425be06-addb-45b9-b3ce-9e414555e5b2" containerName="init" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.587107 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6425be06-addb-45b9-b3ce-9e414555e5b2" containerName="init" Feb 23 07:07:01 crc kubenswrapper[5047]: E0223 07:07:01.587172 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1521c5-df95-4084-8c0e-301abbc8e28e" containerName="init" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.587184 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1521c5-df95-4084-8c0e-301abbc8e28e" 
containerName="init" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.587694 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1521c5-df95-4084-8c0e-301abbc8e28e" containerName="init" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.587748 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6425be06-addb-45b9-b3ce-9e414555e5b2" containerName="init" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.587780 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c6907d-dad6-4fc7-942d-5cb624bc524e" containerName="keystone-bootstrap" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.590945 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.598063 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-whfdb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.598063 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.598310 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.598491 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.598576 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.619347 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lc9cb"] Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.707847 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-fernet-keys\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.707923 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-credential-keys\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.707979 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qck4f\" (UniqueName: \"kubernetes.io/projected/ed1fbf93-445e-478b-9f00-347d83b83977-kube-api-access-qck4f\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.708014 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-scripts\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.708053 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-combined-ca-bundle\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.708100 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-config-data\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.810092 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-scripts\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.810194 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-combined-ca-bundle\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.810261 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-config-data\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.810338 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-fernet-keys\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.810375 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-credential-keys\") pod \"keystone-bootstrap-lc9cb\" (UID: 
\"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.810430 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qck4f\" (UniqueName: \"kubernetes.io/projected/ed1fbf93-445e-478b-9f00-347d83b83977-kube-api-access-qck4f\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.818177 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-combined-ca-bundle\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.820441 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-scripts\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.821463 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-config-data\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.824990 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-fernet-keys\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 
07:07:01.841216 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-credential-keys\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.841411 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qck4f\" (UniqueName: \"kubernetes.io/projected/ed1fbf93-445e-478b-9f00-347d83b83977-kube-api-access-qck4f\") pod \"keystone-bootstrap-lc9cb\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:01 crc kubenswrapper[5047]: I0223 07:07:01.957076 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:02 crc kubenswrapper[5047]: I0223 07:07:02.061469 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Feb 23 07:07:02 crc kubenswrapper[5047]: I0223 07:07:02.355834 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c6907d-dad6-4fc7-942d-5cb624bc524e" path="/var/lib/kubelet/pods/73c6907d-dad6-4fc7-942d-5cb624bc524e/volumes" Feb 23 07:07:07 crc kubenswrapper[5047]: I0223 07:07:07.061782 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Feb 23 07:07:07 crc kubenswrapper[5047]: I0223 07:07:07.648961 5047 generic.go:334] "Generic (PLEG): container finished" podID="67fadcfb-082c-4b01-9972-816963267dff" 
containerID="ce63d2873c33fcb8fe6a6f80101295092c9629334e7ca2d463fb1fa48336a19d" exitCode=0 Feb 23 07:07:07 crc kubenswrapper[5047]: I0223 07:07:07.649021 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zsv6" event={"ID":"67fadcfb-082c-4b01-9972-816963267dff","Type":"ContainerDied","Data":"ce63d2873c33fcb8fe6a6f80101295092c9629334e7ca2d463fb1fa48336a19d"} Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.300118 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.301194 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463291 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-combined-ca-bundle\") pod \"f8b73cfe-a7a9-45be-8773-0edac983a26f\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463340 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-config-data\") pod \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463368 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-scripts\") pod \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463406 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-scripts\") pod \"f8b73cfe-a7a9-45be-8773-0edac983a26f\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463435 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6cst\" (UniqueName: \"kubernetes.io/projected/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-kube-api-access-l6cst\") pod \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463460 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-logs\") pod \"f8b73cfe-a7a9-45be-8773-0edac983a26f\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463500 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-httpd-run\") pod \"f8b73cfe-a7a9-45be-8773-0edac983a26f\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463522 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-logs\") pod \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463550 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-config-data\") pod \"f8b73cfe-a7a9-45be-8773-0edac983a26f\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463595 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463641 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-httpd-run\") pod \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463756 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f8b73cfe-a7a9-45be-8773-0edac983a26f\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463843 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-combined-ca-bundle\") pod \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\" (UID: \"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.463866 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6z5k\" (UniqueName: \"kubernetes.io/projected/f8b73cfe-a7a9-45be-8773-0edac983a26f-kube-api-access-q6z5k\") pod \"f8b73cfe-a7a9-45be-8773-0edac983a26f\" (UID: \"f8b73cfe-a7a9-45be-8773-0edac983a26f\") " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.464439 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-logs" (OuterVolumeSpecName: "logs") pod "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" (UID: "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.464571 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-logs" (OuterVolumeSpecName: "logs") pod "f8b73cfe-a7a9-45be-8773-0edac983a26f" (UID: "f8b73cfe-a7a9-45be-8773-0edac983a26f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.464809 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f8b73cfe-a7a9-45be-8773-0edac983a26f" (UID: "f8b73cfe-a7a9-45be-8773-0edac983a26f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.470812 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" (UID: "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.471864 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-kube-api-access-l6cst" (OuterVolumeSpecName: "kube-api-access-l6cst") pod "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" (UID: "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822"). InnerVolumeSpecName "kube-api-access-l6cst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.474045 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" (UID: "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.474792 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "f8b73cfe-a7a9-45be-8773-0edac983a26f" (UID: "f8b73cfe-a7a9-45be-8773-0edac983a26f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.475456 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b73cfe-a7a9-45be-8773-0edac983a26f-kube-api-access-q6z5k" (OuterVolumeSpecName: "kube-api-access-q6z5k") pod "f8b73cfe-a7a9-45be-8773-0edac983a26f" (UID: "f8b73cfe-a7a9-45be-8773-0edac983a26f"). InnerVolumeSpecName "kube-api-access-q6z5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.476970 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-scripts" (OuterVolumeSpecName: "scripts") pod "f8b73cfe-a7a9-45be-8773-0edac983a26f" (UID: "f8b73cfe-a7a9-45be-8773-0edac983a26f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.488229 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-scripts" (OuterVolumeSpecName: "scripts") pod "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" (UID: "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.514556 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" (UID: "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.523286 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8b73cfe-a7a9-45be-8773-0edac983a26f" (UID: "f8b73cfe-a7a9-45be-8773-0edac983a26f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.528042 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-config-data" (OuterVolumeSpecName: "config-data") pod "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" (UID: "1ed54528-ea6c-4b86-b10c-d3b0bd7cf822"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.544928 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-config-data" (OuterVolumeSpecName: "config-data") pod "f8b73cfe-a7a9-45be-8773-0edac983a26f" (UID: "f8b73cfe-a7a9-45be-8773-0edac983a26f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566306 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566360 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566374 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566388 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566401 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6cst\" (UniqueName: \"kubernetes.io/projected/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-kube-api-access-l6cst\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566416 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566427 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8b73cfe-a7a9-45be-8773-0edac983a26f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566441 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566453 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b73cfe-a7a9-45be-8773-0edac983a26f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566507 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566522 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566543 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566559 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.566574 5047 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-q6z5k\" (UniqueName: \"kubernetes.io/projected/f8b73cfe-a7a9-45be-8773-0edac983a26f-kube-api-access-q6z5k\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.591287 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.597824 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.661083 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8b73cfe-a7a9-45be-8773-0edac983a26f","Type":"ContainerDied","Data":"02f91a5d613f258d87e6e54abf9b3a18bc49cba7b61e02a6994d57f450dd9dc4"} Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.661109 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.661163 5047 scope.go:117] "RemoveContainer" containerID="79eb79b6849212579adcd96a0c9b0534b3df6b745c908bf3ed0f5d2ab6f834c1" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.666607 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.670468 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed54528-ea6c-4b86-b10c-d3b0bd7cf822","Type":"ContainerDied","Data":"c2d6734d54be1a1d077f3ee89e8d92736ab5afe77eec092c632e131cdec242e6"} Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.671981 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.672102 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.712922 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.734110 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.787352 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.799422 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.808149 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: E0223 07:07:08.808764 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-log" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.808787 5047 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-log" Feb 23 07:07:08 crc kubenswrapper[5047]: E0223 07:07:08.808804 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-log" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.808813 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-log" Feb 23 07:07:08 crc kubenswrapper[5047]: E0223 07:07:08.808820 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-httpd" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.808827 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-httpd" Feb 23 07:07:08 crc kubenswrapper[5047]: E0223 07:07:08.808841 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-httpd" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.808847 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-httpd" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.809082 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-httpd" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.809105 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" containerName="glance-log" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.809119 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-log" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.809136 5047 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" containerName="glance-httpd" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.810673 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.813363 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.814186 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rhm9s" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.814490 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.814759 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.829394 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.836006 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.838384 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.838634 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.841452 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.854057 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:08 crc kubenswrapper[5047]: E0223 07:07:08.900308 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b73cfe_a7a9_45be_8773_0edac983a26f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b73cfe_a7a9_45be_8773_0edac983a26f.slice/crio-02f91a5d613f258d87e6e54abf9b3a18bc49cba7b61e02a6994d57f450dd9dc4\": RecentStats: unable to find data in memory cache]" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.984961 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985045 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985078 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985097 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-logs\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985120 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xjw\" (UniqueName: \"kubernetes.io/projected/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-kube-api-access-s6xjw\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985165 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985196 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985217 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985247 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-scripts\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985273 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqwl\" (UniqueName: \"kubernetes.io/projected/261cbbc0-ab7f-4259-94b2-00ab04e23187-kube-api-access-jtqwl\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985329 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985385 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-config-data\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985413 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985428 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985474 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:08 crc kubenswrapper[5047]: I0223 07:07:08.985507 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.087333 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.087777 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.087808 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.087836 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-scripts\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.087862 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqwl\" (UniqueName: \"kubernetes.io/projected/261cbbc0-ab7f-4259-94b2-00ab04e23187-kube-api-access-jtqwl\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.087937 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.087979 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-config-data\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088012 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088038 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088077 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088120 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-httpd-run\") 
pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088161 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088213 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088247 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088269 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-logs\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.088299 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6xjw\" (UniqueName: \"kubernetes.io/projected/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-kube-api-access-s6xjw\") pod \"glance-default-internal-api-0\" (UID: 
\"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.089069 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.091604 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.092661 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.095292 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.095777 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-logs\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc 
kubenswrapper[5047]: I0223 07:07:09.095933 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.096202 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.097511 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-scripts\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.098370 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-config-data\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.098991 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.103787 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.104773 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.105876 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.109795 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.112040 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xjw\" (UniqueName: \"kubernetes.io/projected/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-kube-api-access-s6xjw\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.112093 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqwl\" (UniqueName: 
\"kubernetes.io/projected/261cbbc0-ab7f-4259-94b2-00ab04e23187-kube-api-access-jtqwl\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.131091 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.132846 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.159129 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: I0223 07:07:09.172182 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:09 crc kubenswrapper[5047]: E0223 07:07:09.297778 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec" Feb 23 07:07:09 crc kubenswrapper[5047]: E0223 07:07:09.298016 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gg67s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorPr
ofile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-vw9rj_openstack(4effbb70-cdb3-42e4-a8d3-e1df904f38b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:07:09 crc kubenswrapper[5047]: E0223 07:07:09.299679 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-vw9rj" podUID="4effbb70-cdb3-42e4-a8d3-e1df904f38b9" Feb 23 07:07:09 crc kubenswrapper[5047]: E0223 07:07:09.686648 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec\\\"\"" pod="openstack/barbican-db-sync-vw9rj" podUID="4effbb70-cdb3-42e4-a8d3-e1df904f38b9" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.358550 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed54528-ea6c-4b86-b10c-d3b0bd7cf822" path="/var/lib/kubelet/pods/1ed54528-ea6c-4b86-b10c-d3b0bd7cf822/volumes" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.360404 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b73cfe-a7a9-45be-8773-0edac983a26f" path="/var/lib/kubelet/pods/f8b73cfe-a7a9-45be-8773-0edac983a26f/volumes" Feb 23 07:07:10 crc kubenswrapper[5047]: E0223 07:07:10.569839 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 23 07:07:10 crc kubenswrapper[5047]: E0223 07:07:10.570289 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99lld,ReadOnly:true,MountPath:/var/run/
secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nglmq_openstack(aca947e1-09b7-404f-9162-94484bf52701): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:07:10 crc kubenswrapper[5047]: E0223 07:07:10.572456 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nglmq" podUID="aca947e1-09b7-404f-9162-94484bf52701" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.600279 5047 scope.go:117] "RemoveContainer" containerID="5c13f390fd300e4d74624f2ea06c5dab39ec8c070cd1cdae9652d52e8797efb2" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.704067 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8zsv6" event={"ID":"67fadcfb-082c-4b01-9972-816963267dff","Type":"ContainerDied","Data":"9fb7f9b26033ae6da38802f2f97a664948895be0191c59508f4e4bb012c2cb7d"} Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.704136 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fb7f9b26033ae6da38802f2f97a664948895be0191c59508f4e4bb012c2cb7d" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 
07:07:10.714055 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" event={"ID":"92786ecb-6090-40dc-8942-001a54d35d12","Type":"ContainerDied","Data":"67781186d2285d661b545a399242388f4781ba57fdc46dda3600364e2adb17b3"} Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.714115 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67781186d2285d661b545a399242388f4781ba57fdc46dda3600364e2adb17b3" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.717280 5047 scope.go:117] "RemoveContainer" containerID="3f877f648acaeb166f67ee91ed49a3582ac0948aa793dbe2db3f2d7b6eb0a300" Feb 23 07:07:10 crc kubenswrapper[5047]: E0223 07:07:10.717813 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-nglmq" podUID="aca947e1-09b7-404f-9162-94484bf52701" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.867257 5047 scope.go:117] "RemoveContainer" containerID="f3661aba84786900e3928939ee027d6b8095617b730b6e7d9bbe1d1719aca6a7" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.871106 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:07:10 crc kubenswrapper[5047]: I0223 07:07:10.970372 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.040535 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-svc\") pod \"92786ecb-6090-40dc-8942-001a54d35d12\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.040625 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-nb\") pod \"92786ecb-6090-40dc-8942-001a54d35d12\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.040756 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-config\") pod \"92786ecb-6090-40dc-8942-001a54d35d12\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.040865 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tgbv\" (UniqueName: \"kubernetes.io/projected/92786ecb-6090-40dc-8942-001a54d35d12-kube-api-access-6tgbv\") pod \"92786ecb-6090-40dc-8942-001a54d35d12\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.040955 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-sb\") pod \"92786ecb-6090-40dc-8942-001a54d35d12\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.040999 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-swift-storage-0\") pod \"92786ecb-6090-40dc-8942-001a54d35d12\" (UID: \"92786ecb-6090-40dc-8942-001a54d35d12\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.051740 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92786ecb-6090-40dc-8942-001a54d35d12-kube-api-access-6tgbv" (OuterVolumeSpecName: "kube-api-access-6tgbv") pod "92786ecb-6090-40dc-8942-001a54d35d12" (UID: "92786ecb-6090-40dc-8942-001a54d35d12"). InnerVolumeSpecName "kube-api-access-6tgbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.087828 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92786ecb-6090-40dc-8942-001a54d35d12" (UID: "92786ecb-6090-40dc-8942-001a54d35d12"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.101519 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "92786ecb-6090-40dc-8942-001a54d35d12" (UID: "92786ecb-6090-40dc-8942-001a54d35d12"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.101543 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92786ecb-6090-40dc-8942-001a54d35d12" (UID: "92786ecb-6090-40dc-8942-001a54d35d12"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.104221 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-config" (OuterVolumeSpecName: "config") pod "92786ecb-6090-40dc-8942-001a54d35d12" (UID: "92786ecb-6090-40dc-8942-001a54d35d12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.106267 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92786ecb-6090-40dc-8942-001a54d35d12" (UID: "92786ecb-6090-40dc-8942-001a54d35d12"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: W0223 07:07:11.114564 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded1fbf93_445e_478b_9f00_347d83b83977.slice/crio-f13811e85dc6baa96f4da99a97912ffa3ba12b8524d58074a041bb9ca2032871 WatchSource:0}: Error finding container f13811e85dc6baa96f4da99a97912ffa3ba12b8524d58074a041bb9ca2032871: Status 404 returned error can't find the container with id f13811e85dc6baa96f4da99a97912ffa3ba12b8524d58074a041bb9ca2032871 Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.116365 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lc9cb"] Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.142620 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zfk9\" (UniqueName: \"kubernetes.io/projected/67fadcfb-082c-4b01-9972-816963267dff-kube-api-access-4zfk9\") pod \"67fadcfb-082c-4b01-9972-816963267dff\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " Feb 23 07:07:11 crc 
kubenswrapper[5047]: I0223 07:07:11.142716 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-config\") pod \"67fadcfb-082c-4b01-9972-816963267dff\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.142758 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-combined-ca-bundle\") pod \"67fadcfb-082c-4b01-9972-816963267dff\" (UID: \"67fadcfb-082c-4b01-9972-816963267dff\") " Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.143454 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.143477 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.143488 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.143498 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.143510 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92786ecb-6090-40dc-8942-001a54d35d12-config\") on node \"crc\" DevicePath \"\"" Feb 
23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.143519 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tgbv\" (UniqueName: \"kubernetes.io/projected/92786ecb-6090-40dc-8942-001a54d35d12-kube-api-access-6tgbv\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.148344 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67fadcfb-082c-4b01-9972-816963267dff-kube-api-access-4zfk9" (OuterVolumeSpecName: "kube-api-access-4zfk9") pod "67fadcfb-082c-4b01-9972-816963267dff" (UID: "67fadcfb-082c-4b01-9972-816963267dff"). InnerVolumeSpecName "kube-api-access-4zfk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.169432 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67fadcfb-082c-4b01-9972-816963267dff" (UID: "67fadcfb-082c-4b01-9972-816963267dff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.187700 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-config" (OuterVolumeSpecName: "config") pod "67fadcfb-082c-4b01-9972-816963267dff" (UID: "67fadcfb-082c-4b01-9972-816963267dff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.188604 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.245273 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zfk9\" (UniqueName: \"kubernetes.io/projected/67fadcfb-082c-4b01-9972-816963267dff-kube-api-access-4zfk9\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.245314 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.245328 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fadcfb-082c-4b01-9972-816963267dff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.289752 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:11 crc kubenswrapper[5047]: W0223 07:07:11.298096 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod391de77f_1a20_4fdf_90fb_413ab5e7cc3f.slice/crio-5e71b3aefe34ac6a57d430db9928cacbb52c8185e7e90d9498c75b2061200258 WatchSource:0}: Error finding container 5e71b3aefe34ac6a57d430db9928cacbb52c8185e7e90d9498c75b2061200258: Status 404 returned error can't find the container with id 5e71b3aefe34ac6a57d430db9928cacbb52c8185e7e90d9498c75b2061200258 Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.726530 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"391de77f-1a20-4fdf-90fb-413ab5e7cc3f","Type":"ContainerStarted","Data":"5e71b3aefe34ac6a57d430db9928cacbb52c8185e7e90d9498c75b2061200258"} Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.728090 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"261cbbc0-ab7f-4259-94b2-00ab04e23187","Type":"ContainerStarted","Data":"c009f361846efdcfc899415f01d7ab42bdc42d46552a6d6ada17f72951240444"} Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.731313 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tqpzs" event={"ID":"b54bac12-7429-4d94-a023-535d11e803d4","Type":"ContainerStarted","Data":"31926a2974cd4ca21172fb1cd25667b92d592918db322cb316be55add0a17a1d"} Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.732858 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerStarted","Data":"303b0be2fc137b88102768998fcad85da0339cb31cd9c5b3f96d8dc5f831d866"} Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.737532 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8zsv6" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.741257 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lc9cb" event={"ID":"ed1fbf93-445e-478b-9f00-347d83b83977","Type":"ContainerStarted","Data":"cd436dcf3211130ff764f076fed340d93284320db3baceec202f8ab4ad5a1294"} Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.741318 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lc9cb" event={"ID":"ed1fbf93-445e-478b-9f00-347d83b83977","Type":"ContainerStarted","Data":"f13811e85dc6baa96f4da99a97912ffa3ba12b8524d58074a041bb9ca2032871"} Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.741360 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-fxhh5" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.752356 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-tqpzs" podStartSLOduration=3.986144914 podStartE2EDuration="25.752329532s" podCreationTimestamp="2026-02-23 07:06:46 +0000 UTC" firstStartedPulling="2026-02-23 07:06:48.782162281 +0000 UTC m=+1331.033489415" lastFinishedPulling="2026-02-23 07:07:10.548346899 +0000 UTC m=+1352.799674033" observedRunningTime="2026-02-23 07:07:11.75037757 +0000 UTC m=+1354.001704704" watchObservedRunningTime="2026-02-23 07:07:11.752329532 +0000 UTC m=+1354.003656666" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.784237 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lc9cb" podStartSLOduration=10.78421374 podStartE2EDuration="10.78421374s" podCreationTimestamp="2026-02-23 07:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:11.772137388 +0000 UTC m=+1354.023464522" 
watchObservedRunningTime="2026-02-23 07:07:11.78421374 +0000 UTC m=+1354.035540864" Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.834676 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-fxhh5"] Feb 23 07:07:11 crc kubenswrapper[5047]: I0223 07:07:11.852855 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-fxhh5"] Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.221247 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-b2csx"] Feb 23 07:07:12 crc kubenswrapper[5047]: E0223 07:07:12.221999 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="dnsmasq-dns" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.222030 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="dnsmasq-dns" Feb 23 07:07:12 crc kubenswrapper[5047]: E0223 07:07:12.222049 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fadcfb-082c-4b01-9972-816963267dff" containerName="neutron-db-sync" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.222056 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fadcfb-082c-4b01-9972-816963267dff" containerName="neutron-db-sync" Feb 23 07:07:12 crc kubenswrapper[5047]: E0223 07:07:12.222077 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="init" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.222083 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="init" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.222241 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="92786ecb-6090-40dc-8942-001a54d35d12" containerName="dnsmasq-dns" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.222267 5047 
memory_manager.go:354] "RemoveStaleState removing state" podUID="67fadcfb-082c-4b01-9972-816963267dff" containerName="neutron-db-sync" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.224363 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.247946 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-b2csx"] Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.371561 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92786ecb-6090-40dc-8942-001a54d35d12" path="/var/lib/kubelet/pods/92786ecb-6090-40dc-8942-001a54d35d12/volumes" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.376018 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.376087 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-config\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.376794 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.377020 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n66rp\" (UniqueName: \"kubernetes.io/projected/bdaf4744-839c-4cad-aae8-c0b7396d9915-kube-api-access-n66rp\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.377099 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.377372 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.458439 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5ff9c9c5c6-pkfqh"] Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.460582 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.464414 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pgvbz" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.465267 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.465469 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.468694 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.479139 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.479212 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n66rp\" (UniqueName: \"kubernetes.io/projected/bdaf4744-839c-4cad-aae8-c0b7396d9915-kube-api-access-n66rp\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.479242 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.479309 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.479396 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.479425 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-config\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.480470 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-config\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.481041 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.489189 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.489639 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ff9c9c5c6-pkfqh"] Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.489668 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.490091 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.509203 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n66rp\" (UniqueName: \"kubernetes.io/projected/bdaf4744-839c-4cad-aae8-c0b7396d9915-kube-api-access-n66rp\") pod \"dnsmasq-dns-db5c97f8f-b2csx\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.560888 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.580973 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-ovndb-tls-certs\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.581060 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxl8\" (UniqueName: \"kubernetes.io/projected/19e4074e-d9fa-436c-97d8-f9d90ea147e2-kube-api-access-vbxl8\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.581095 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-httpd-config\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.581125 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-config\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.581167 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-combined-ca-bundle\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: 
\"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.683262 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxl8\" (UniqueName: \"kubernetes.io/projected/19e4074e-d9fa-436c-97d8-f9d90ea147e2-kube-api-access-vbxl8\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.683607 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-httpd-config\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.683644 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-config\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.683693 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-combined-ca-bundle\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.683747 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-ovndb-tls-certs\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 
crc kubenswrapper[5047]: I0223 07:07:12.694290 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-httpd-config\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.696647 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-combined-ca-bundle\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.700145 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-config\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.702885 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxl8\" (UniqueName: \"kubernetes.io/projected/19e4074e-d9fa-436c-97d8-f9d90ea147e2-kube-api-access-vbxl8\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.709878 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-ovndb-tls-certs\") pod \"neutron-5ff9c9c5c6-pkfqh\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") " pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.760068 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"261cbbc0-ab7f-4259-94b2-00ab04e23187","Type":"ContainerStarted","Data":"fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435"} Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.760141 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"261cbbc0-ab7f-4259-94b2-00ab04e23187","Type":"ContainerStarted","Data":"1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4"} Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.766517 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"391de77f-1a20-4fdf-90fb-413ab5e7cc3f","Type":"ContainerStarted","Data":"32f4b174839bd5ba33fa916bb23331139b624944cb0f61b780cd7e606f20324a"} Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.800438 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:12 crc kubenswrapper[5047]: I0223 07:07:12.815777 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.815727087 podStartE2EDuration="4.815727087s" podCreationTimestamp="2026-02-23 07:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:12.790394683 +0000 UTC m=+1355.041721817" watchObservedRunningTime="2026-02-23 07:07:12.815727087 +0000 UTC m=+1355.067054221" Feb 23 07:07:13 crc kubenswrapper[5047]: I0223 07:07:13.518263 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-b2csx"] Feb 23 07:07:13 crc kubenswrapper[5047]: W0223 07:07:13.543365 5047 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdaf4744_839c_4cad_aae8_c0b7396d9915.slice/crio-205259eaeb81467f1579bfaca53fc93c660c4444f400e9fb9fe045ab24e36cba WatchSource:0}: Error finding container 205259eaeb81467f1579bfaca53fc93c660c4444f400e9fb9fe045ab24e36cba: Status 404 returned error can't find the container with id 205259eaeb81467f1579bfaca53fc93c660c4444f400e9fb9fe045ab24e36cba Feb 23 07:07:13 crc kubenswrapper[5047]: I0223 07:07:13.683614 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ff9c9c5c6-pkfqh"] Feb 23 07:07:13 crc kubenswrapper[5047]: I0223 07:07:13.784317 5047 generic.go:334] "Generic (PLEG): container finished" podID="b54bac12-7429-4d94-a023-535d11e803d4" containerID="31926a2974cd4ca21172fb1cd25667b92d592918db322cb316be55add0a17a1d" exitCode=0 Feb 23 07:07:13 crc kubenswrapper[5047]: I0223 07:07:13.784421 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tqpzs" event={"ID":"b54bac12-7429-4d94-a023-535d11e803d4","Type":"ContainerDied","Data":"31926a2974cd4ca21172fb1cd25667b92d592918db322cb316be55add0a17a1d"} Feb 23 07:07:13 crc kubenswrapper[5047]: I0223 07:07:13.787114 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" event={"ID":"bdaf4744-839c-4cad-aae8-c0b7396d9915","Type":"ContainerStarted","Data":"205259eaeb81467f1579bfaca53fc93c660c4444f400e9fb9fe045ab24e36cba"} Feb 23 07:07:14 crc kubenswrapper[5047]: W0223 07:07:14.745151 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19e4074e_d9fa_436c_97d8_f9d90ea147e2.slice/crio-c5e5f341371e2c3ad0918231991c95e427953084a3ddaf656a0b81f186095417 WatchSource:0}: Error finding container c5e5f341371e2c3ad0918231991c95e427953084a3ddaf656a0b81f186095417: Status 404 returned error can't find the container with id c5e5f341371e2c3ad0918231991c95e427953084a3ddaf656a0b81f186095417 Feb 23 
07:07:14 crc kubenswrapper[5047]: I0223 07:07:14.800886 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ff9c9c5c6-pkfqh" event={"ID":"19e4074e-d9fa-436c-97d8-f9d90ea147e2","Type":"ContainerStarted","Data":"c5e5f341371e2c3ad0918231991c95e427953084a3ddaf656a0b81f186095417"} Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.074506 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54d54c8555-hb2hm"] Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.077721 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.084679 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.085276 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.109971 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54d54c8555-hb2hm"] Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.144756 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-config\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.144948 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wd9m\" (UniqueName: \"kubernetes.io/projected/89c85b56-2c82-442b-af26-77a1f0d294f4-kube-api-access-8wd9m\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.145135 
5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-combined-ca-bundle\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.145261 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-internal-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.145434 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-public-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.145466 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-httpd-config\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.145560 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-ovndb-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 
07:07:15.248433 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-internal-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.248515 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-public-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.248533 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-httpd-config\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.248559 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-ovndb-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.248605 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-config\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.248645 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8wd9m\" (UniqueName: \"kubernetes.io/projected/89c85b56-2c82-442b-af26-77a1f0d294f4-kube-api-access-8wd9m\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.248687 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-combined-ca-bundle\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.257158 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-combined-ca-bundle\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.258233 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-httpd-config\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.258463 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-config\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.260872 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-ovndb-tls-certs\") pod 
\"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.264252 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-public-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.264823 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-internal-tls-certs\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.273138 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tqpzs" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.273509 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wd9m\" (UniqueName: \"kubernetes.io/projected/89c85b56-2c82-442b-af26-77a1f0d294f4-kube-api-access-8wd9m\") pod \"neutron-54d54c8555-hb2hm\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.350213 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-scripts\") pod \"b54bac12-7429-4d94-a023-535d11e803d4\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.350335 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-config-data\") pod \"b54bac12-7429-4d94-a023-535d11e803d4\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.350473 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-combined-ca-bundle\") pod \"b54bac12-7429-4d94-a023-535d11e803d4\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.350554 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54bac12-7429-4d94-a023-535d11e803d4-logs\") pod \"b54bac12-7429-4d94-a023-535d11e803d4\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.350594 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx55v\" (UniqueName: \"kubernetes.io/projected/b54bac12-7429-4d94-a023-535d11e803d4-kube-api-access-wx55v\") pod \"b54bac12-7429-4d94-a023-535d11e803d4\" (UID: \"b54bac12-7429-4d94-a023-535d11e803d4\") " Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.352162 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54bac12-7429-4d94-a023-535d11e803d4-logs" (OuterVolumeSpecName: "logs") pod "b54bac12-7429-4d94-a023-535d11e803d4" (UID: "b54bac12-7429-4d94-a023-535d11e803d4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.356711 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54bac12-7429-4d94-a023-535d11e803d4-kube-api-access-wx55v" (OuterVolumeSpecName: "kube-api-access-wx55v") pod "b54bac12-7429-4d94-a023-535d11e803d4" (UID: "b54bac12-7429-4d94-a023-535d11e803d4"). InnerVolumeSpecName "kube-api-access-wx55v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.358244 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-scripts" (OuterVolumeSpecName: "scripts") pod "b54bac12-7429-4d94-a023-535d11e803d4" (UID: "b54bac12-7429-4d94-a023-535d11e803d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.386126 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b54bac12-7429-4d94-a023-535d11e803d4" (UID: "b54bac12-7429-4d94-a023-535d11e803d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.386233 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-config-data" (OuterVolumeSpecName: "config-data") pod "b54bac12-7429-4d94-a023-535d11e803d4" (UID: "b54bac12-7429-4d94-a023-535d11e803d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.453279 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.453320 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.453332 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54bac12-7429-4d94-a023-535d11e803d4-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.453341 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx55v\" (UniqueName: \"kubernetes.io/projected/b54bac12-7429-4d94-a023-535d11e803d4-kube-api-access-wx55v\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.453352 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54bac12-7429-4d94-a023-535d11e803d4-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.563623 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.832168 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerStarted","Data":"817654c6409668e7517ce28241e05825ddfc2915782d23b89179dd2479319550"} Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.842607 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ff9c9c5c6-pkfqh" event={"ID":"19e4074e-d9fa-436c-97d8-f9d90ea147e2","Type":"ContainerStarted","Data":"48beb39652f6bbd9ec63e7ff1fa9cf1cf2543500d09199781370b0ee22d9c880"} Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.842664 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ff9c9c5c6-pkfqh" event={"ID":"19e4074e-d9fa-436c-97d8-f9d90ea147e2","Type":"ContainerStarted","Data":"39e5ccb29704604c78f137df87ae9930b8b1ecf21406a4701a0850a629ebcdfd"} Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.845091 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.861022 5047 generic.go:334] "Generic (PLEG): container finished" podID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerID="c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57" exitCode=0 Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.861100 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" event={"ID":"bdaf4744-839c-4cad-aae8-c0b7396d9915","Type":"ContainerDied","Data":"c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57"} Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.916098 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"391de77f-1a20-4fdf-90fb-413ab5e7cc3f","Type":"ContainerStarted","Data":"ace6ecfe551c44e9efb31d1ea90d89e9ff52a5828d4bf6e2593aac5d72081b6e"} Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.921108 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5ff9c9c5c6-pkfqh" podStartSLOduration=3.921085774 podStartE2EDuration="3.921085774s" podCreationTimestamp="2026-02-23 07:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:15.873785127 +0000 UTC m=+1358.125112271" watchObservedRunningTime="2026-02-23 07:07:15.921085774 +0000 UTC m=+1358.172412908" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.924657 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tqpzs" event={"ID":"b54bac12-7429-4d94-a023-535d11e803d4","Type":"ContainerDied","Data":"8decca2cc6f412751399db0ca2aa9a750030d9748945dab9138e51ae141a4b33"} Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.924820 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8decca2cc6f412751399db0ca2aa9a750030d9748945dab9138e51ae141a4b33" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.924835 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tqpzs" Feb 23 07:07:15 crc kubenswrapper[5047]: I0223 07:07:15.976828 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.976809946 podStartE2EDuration="7.976809946s" podCreationTimestamp="2026-02-23 07:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:15.97242581 +0000 UTC m=+1358.223752944" watchObservedRunningTime="2026-02-23 07:07:15.976809946 +0000 UTC m=+1358.228137080" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.017045 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c6b9994d4-nt845"] Feb 23 07:07:16 crc kubenswrapper[5047]: E0223 07:07:16.020983 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54bac12-7429-4d94-a023-535d11e803d4" containerName="placement-db-sync" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.021075 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54bac12-7429-4d94-a023-535d11e803d4" containerName="placement-db-sync" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.021351 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54bac12-7429-4d94-a023-535d11e803d4" containerName="placement-db-sync" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.022533 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.027608 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.027779 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.027960 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-48c7h" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.028078 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.028683 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.032997 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c6b9994d4-nt845"] Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.078744 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cca3de-06a4-4adf-b400-fe78a34dbb65-logs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.078840 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-scripts\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.078894 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-public-tls-certs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.078982 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-combined-ca-bundle\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.079030 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-internal-tls-certs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.079088 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-config-data\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.079122 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkk5\" (UniqueName: \"kubernetes.io/projected/a7cca3de-06a4-4adf-b400-fe78a34dbb65-kube-api-access-xwkk5\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.181732 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-combined-ca-bundle\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.182190 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-internal-tls-certs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.182242 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-config-data\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.182302 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwkk5\" (UniqueName: \"kubernetes.io/projected/a7cca3de-06a4-4adf-b400-fe78a34dbb65-kube-api-access-xwkk5\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.182358 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cca3de-06a4-4adf-b400-fe78a34dbb65-logs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.182415 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-scripts\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.182454 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-public-tls-certs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.183360 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cca3de-06a4-4adf-b400-fe78a34dbb65-logs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.190653 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-internal-tls-certs\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.190873 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-combined-ca-bundle\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.192746 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-public-tls-certs\") pod \"placement-c6b9994d4-nt845\" (UID: 
\"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.193372 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-config-data\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.194283 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54d54c8555-hb2hm"] Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.203825 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-scripts\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.204399 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwkk5\" (UniqueName: \"kubernetes.io/projected/a7cca3de-06a4-4adf-b400-fe78a34dbb65-kube-api-access-xwkk5\") pod \"placement-c6b9994d4-nt845\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") " pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: W0223 07:07:16.209204 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89c85b56_2c82_442b_af26_77a1f0d294f4.slice/crio-ee45005f916de2f639683a0185ceb146e57fd7e1d5283e699364f1fa53f26153 WatchSource:0}: Error finding container ee45005f916de2f639683a0185ceb146e57fd7e1d5283e699364f1fa53f26153: Status 404 returned error can't find the container with id ee45005f916de2f639683a0185ceb146e57fd7e1d5283e699364f1fa53f26153 Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.456693 5047 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.931352 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c6b9994d4-nt845"] Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.934985 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54d54c8555-hb2hm" event={"ID":"89c85b56-2c82-442b-af26-77a1f0d294f4","Type":"ContainerStarted","Data":"2249ebf762a07cddeea531d1e4a71499a988887c4ca6bcba65fac88ec4cbd24f"} Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.935048 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54d54c8555-hb2hm" event={"ID":"89c85b56-2c82-442b-af26-77a1f0d294f4","Type":"ContainerStarted","Data":"e0295183d4622f15cec9404b2b73a238a6d5a5e45dd0fa339ba26d01acc57cfd"} Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.935065 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54d54c8555-hb2hm" event={"ID":"89c85b56-2c82-442b-af26-77a1f0d294f4","Type":"ContainerStarted","Data":"ee45005f916de2f639683a0185ceb146e57fd7e1d5283e699364f1fa53f26153"} Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.935135 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.938779 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" event={"ID":"bdaf4744-839c-4cad-aae8-c0b7396d9915","Type":"ContainerStarted","Data":"992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c"} Feb 23 07:07:16 crc kubenswrapper[5047]: W0223 07:07:16.938942 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7cca3de_06a4_4adf_b400_fe78a34dbb65.slice/crio-09e358999d6346f0b6b1719bea325de7c97c4864c62d52af83d3d2ba1e7d56b1 WatchSource:0}: Error 
finding container 09e358999d6346f0b6b1719bea325de7c97c4864c62d52af83d3d2ba1e7d56b1: Status 404 returned error can't find the container with id 09e358999d6346f0b6b1719bea325de7c97c4864c62d52af83d3d2ba1e7d56b1 Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.938952 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.941281 5047 generic.go:334] "Generic (PLEG): container finished" podID="ed1fbf93-445e-478b-9f00-347d83b83977" containerID="cd436dcf3211130ff764f076fed340d93284320db3baceec202f8ab4ad5a1294" exitCode=0 Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.941400 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lc9cb" event={"ID":"ed1fbf93-445e-478b-9f00-347d83b83977","Type":"ContainerDied","Data":"cd436dcf3211130ff764f076fed340d93284320db3baceec202f8ab4ad5a1294"} Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.975168 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54d54c8555-hb2hm" podStartSLOduration=1.975136804 podStartE2EDuration="1.975136804s" podCreationTimestamp="2026-02-23 07:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:16.959093578 +0000 UTC m=+1359.210420712" watchObservedRunningTime="2026-02-23 07:07:16.975136804 +0000 UTC m=+1359.226463948" Feb 23 07:07:16 crc kubenswrapper[5047]: I0223 07:07:16.988657 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" podStartSLOduration=4.988631464 podStartE2EDuration="4.988631464s" podCreationTimestamp="2026-02-23 07:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:16.982260994 +0000 UTC m=+1359.233588128" 
watchObservedRunningTime="2026-02-23 07:07:16.988631464 +0000 UTC m=+1359.239958618" Feb 23 07:07:17 crc kubenswrapper[5047]: I0223 07:07:17.951820 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6b9994d4-nt845" event={"ID":"a7cca3de-06a4-4adf-b400-fe78a34dbb65","Type":"ContainerStarted","Data":"4b60e5151d596147b7021c9545594a95c4b3f30414105fd3a7f9fe33908978eb"} Feb 23 07:07:17 crc kubenswrapper[5047]: I0223 07:07:17.952295 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6b9994d4-nt845" event={"ID":"a7cca3de-06a4-4adf-b400-fe78a34dbb65","Type":"ContainerStarted","Data":"71648def4db91a786a875c748f6f4a8187a401446c788caf05a7425b249ebded"} Feb 23 07:07:17 crc kubenswrapper[5047]: I0223 07:07:17.952306 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6b9994d4-nt845" event={"ID":"a7cca3de-06a4-4adf-b400-fe78a34dbb65","Type":"ContainerStarted","Data":"09e358999d6346f0b6b1719bea325de7c97c4864c62d52af83d3d2ba1e7d56b1"} Feb 23 07:07:17 crc kubenswrapper[5047]: I0223 07:07:17.977827 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c6b9994d4-nt845" podStartSLOduration=2.977801945 podStartE2EDuration="2.977801945s" podCreationTimestamp="2026-02-23 07:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:17.976629936 +0000 UTC m=+1360.227957070" watchObservedRunningTime="2026-02-23 07:07:17.977801945 +0000 UTC m=+1360.229129079" Feb 23 07:07:18 crc kubenswrapper[5047]: I0223 07:07:18.964035 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:18 crc kubenswrapper[5047]: I0223 07:07:18.964777 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.160057 
5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.160114 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.172969 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.174204 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.198828 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.214885 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.215075 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.237581 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.978955 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lc9cb" event={"ID":"ed1fbf93-445e-478b-9f00-347d83b83977","Type":"ContainerDied","Data":"f13811e85dc6baa96f4da99a97912ffa3ba12b8524d58074a041bb9ca2032871"} Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.979015 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13811e85dc6baa96f4da99a97912ffa3ba12b8524d58074a041bb9ca2032871" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.980603 
5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.980648 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.980660 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:07:19 crc kubenswrapper[5047]: I0223 07:07:19.980670 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.122306 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.296401 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-config-data\") pod \"ed1fbf93-445e-478b-9f00-347d83b83977\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.296895 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-combined-ca-bundle\") pod \"ed1fbf93-445e-478b-9f00-347d83b83977\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.297028 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-fernet-keys\") pod \"ed1fbf93-445e-478b-9f00-347d83b83977\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.297071 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-credential-keys\") pod \"ed1fbf93-445e-478b-9f00-347d83b83977\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.297100 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qck4f\" (UniqueName: \"kubernetes.io/projected/ed1fbf93-445e-478b-9f00-347d83b83977-kube-api-access-qck4f\") pod \"ed1fbf93-445e-478b-9f00-347d83b83977\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.297208 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-scripts\") pod \"ed1fbf93-445e-478b-9f00-347d83b83977\" (UID: \"ed1fbf93-445e-478b-9f00-347d83b83977\") " Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.305730 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed1fbf93-445e-478b-9f00-347d83b83977" (UID: "ed1fbf93-445e-478b-9f00-347d83b83977"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.306300 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-scripts" (OuterVolumeSpecName: "scripts") pod "ed1fbf93-445e-478b-9f00-347d83b83977" (UID: "ed1fbf93-445e-478b-9f00-347d83b83977"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.312598 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed1fbf93-445e-478b-9f00-347d83b83977" (UID: "ed1fbf93-445e-478b-9f00-347d83b83977"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.319170 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1fbf93-445e-478b-9f00-347d83b83977-kube-api-access-qck4f" (OuterVolumeSpecName: "kube-api-access-qck4f") pod "ed1fbf93-445e-478b-9f00-347d83b83977" (UID: "ed1fbf93-445e-478b-9f00-347d83b83977"). InnerVolumeSpecName "kube-api-access-qck4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.331517 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-config-data" (OuterVolumeSpecName: "config-data") pod "ed1fbf93-445e-478b-9f00-347d83b83977" (UID: "ed1fbf93-445e-478b-9f00-347d83b83977"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.350170 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed1fbf93-445e-478b-9f00-347d83b83977" (UID: "ed1fbf93-445e-478b-9f00-347d83b83977"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.398987 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qck4f\" (UniqueName: \"kubernetes.io/projected/ed1fbf93-445e-478b-9f00-347d83b83977-kube-api-access-qck4f\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.399034 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.399055 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.399066 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.399077 5047 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.399086 5047 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1fbf93-445e-478b-9f00-347d83b83977-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 07:07:20.991731 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerStarted","Data":"299090d986c8c6ed9b50c0a6fb790c96a09c902c1c2aea7366f06a25dbe324c0"} Feb 23 07:07:20 crc kubenswrapper[5047]: I0223 
07:07:20.991883 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lc9cb" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.345842 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5864496f8c-9bg5d"] Feb 23 07:07:21 crc kubenswrapper[5047]: E0223 07:07:21.346279 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1fbf93-445e-478b-9f00-347d83b83977" containerName="keystone-bootstrap" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.346300 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1fbf93-445e-478b-9f00-347d83b83977" containerName="keystone-bootstrap" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.346493 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1fbf93-445e-478b-9f00-347d83b83977" containerName="keystone-bootstrap" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.347187 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.357087 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.357131 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.357418 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.357604 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-whfdb" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.357714 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.358013 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.362643 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5864496f8c-9bg5d"] Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420231 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-fernet-keys\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420280 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-public-tls-certs\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " 
pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420315 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-scripts\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420396 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-credential-keys\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420450 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-internal-tls-certs\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420473 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-config-data\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420488 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-combined-ca-bundle\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " 
pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.420533 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xwq\" (UniqueName: \"kubernetes.io/projected/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-kube-api-access-r9xwq\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.523503 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-fernet-keys\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.523557 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-public-tls-certs\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.523608 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-scripts\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.523700 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-credential-keys\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc 
kubenswrapper[5047]: I0223 07:07:21.523765 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-internal-tls-certs\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.523800 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-config-data\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.523823 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-combined-ca-bundle\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.523873 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xwq\" (UniqueName: \"kubernetes.io/projected/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-kube-api-access-r9xwq\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.532435 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-public-tls-certs\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.533699 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-combined-ca-bundle\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.536672 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-credential-keys\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.541607 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-config-data\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.542157 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-scripts\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.543563 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-fernet-keys\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.545267 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-internal-tls-certs\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.556167 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xwq\" (UniqueName: \"kubernetes.io/projected/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-kube-api-access-r9xwq\") pod \"keystone-5864496f8c-9bg5d\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:21 crc kubenswrapper[5047]: I0223 07:07:21.670117 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:22 crc kubenswrapper[5047]: I0223 07:07:22.006228 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:07:22 crc kubenswrapper[5047]: I0223 07:07:22.006677 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:07:22 crc kubenswrapper[5047]: I0223 07:07:22.217863 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5864496f8c-9bg5d"] Feb 23 07:07:22 crc kubenswrapper[5047]: I0223 07:07:22.563130 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:22 crc kubenswrapper[5047]: I0223 07:07:22.653079 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-qmfq9"] Feb 23 07:07:22 crc kubenswrapper[5047]: I0223 07:07:22.653388 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" podUID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerName="dnsmasq-dns" containerID="cri-o://768a91714a4c77f293aad71986199d4ec8a4b28e9c11e97ea29bb3aa818453d8" gracePeriod=10 Feb 23 07:07:22 crc kubenswrapper[5047]: I0223 07:07:22.684878 5047 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.021634 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.021780 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.069363 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5864496f8c-9bg5d" event={"ID":"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08","Type":"ContainerStarted","Data":"fc5d3551724091dfc09fc2b99a9eac93b56874e2010d67bb0d6736fc0d101cdb"} Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.069443 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5864496f8c-9bg5d" event={"ID":"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08","Type":"ContainerStarted","Data":"8c777d86ec08197b30580a725928ba2f64aeba450777ffc5f6f6246ad3e4b4ea"} Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.070593 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.089470 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.126357 5047 generic.go:334] "Generic (PLEG): container finished" podID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerID="768a91714a4c77f293aad71986199d4ec8a4b28e9c11e97ea29bb3aa818453d8" exitCode=0 Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.126830 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.129712 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" 
event={"ID":"c24a7665-aaa1-4ec3-bb37-2ec812f8386e","Type":"ContainerDied","Data":"768a91714a4c77f293aad71986199d4ec8a4b28e9c11e97ea29bb3aa818453d8"} Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.193040 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5864496f8c-9bg5d" podStartSLOduration=2.193015139 podStartE2EDuration="2.193015139s" podCreationTimestamp="2026-02-23 07:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:23.162399565 +0000 UTC m=+1365.413726699" watchObservedRunningTime="2026-02-23 07:07:23.193015139 +0000 UTC m=+1365.444342273" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.427552 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.447295 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.510714 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-sb\") pod \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.510786 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-swift-storage-0\") pod \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.510939 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj2m5\" (UniqueName: \"kubernetes.io/projected/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-kube-api-access-rj2m5\") pod \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.510966 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-nb\") pod \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.511091 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-svc\") pod \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.511208 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-config\") pod \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\" (UID: \"c24a7665-aaa1-4ec3-bb37-2ec812f8386e\") " Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.522010 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-kube-api-access-rj2m5" (OuterVolumeSpecName: "kube-api-access-rj2m5") pod "c24a7665-aaa1-4ec3-bb37-2ec812f8386e" (UID: "c24a7665-aaa1-4ec3-bb37-2ec812f8386e"). InnerVolumeSpecName "kube-api-access-rj2m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.615810 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj2m5\" (UniqueName: \"kubernetes.io/projected/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-kube-api-access-rj2m5\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.637993 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c24a7665-aaa1-4ec3-bb37-2ec812f8386e" (UID: "c24a7665-aaa1-4ec3-bb37-2ec812f8386e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.647943 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c24a7665-aaa1-4ec3-bb37-2ec812f8386e" (UID: "c24a7665-aaa1-4ec3-bb37-2ec812f8386e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.664594 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c24a7665-aaa1-4ec3-bb37-2ec812f8386e" (UID: "c24a7665-aaa1-4ec3-bb37-2ec812f8386e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.676639 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c24a7665-aaa1-4ec3-bb37-2ec812f8386e" (UID: "c24a7665-aaa1-4ec3-bb37-2ec812f8386e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.692614 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-config" (OuterVolumeSpecName: "config") pod "c24a7665-aaa1-4ec3-bb37-2ec812f8386e" (UID: "c24a7665-aaa1-4ec3-bb37-2ec812f8386e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.719060 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.719110 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.719122 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.719135 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:23 crc kubenswrapper[5047]: I0223 07:07:23.719145 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24a7665-aaa1-4ec3-bb37-2ec812f8386e-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.152082 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" event={"ID":"c24a7665-aaa1-4ec3-bb37-2ec812f8386e","Type":"ContainerDied","Data":"d7cbc3b229cfed82e13ad2851084eee2ceab1f395c428ac3c685e098593fd027"} Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.152595 5047 scope.go:117] "RemoveContainer" containerID="768a91714a4c77f293aad71986199d4ec8a4b28e9c11e97ea29bb3aa818453d8" Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.152805 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-qmfq9" Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.171067 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nglmq" event={"ID":"aca947e1-09b7-404f-9162-94484bf52701","Type":"ContainerStarted","Data":"80ad6f181096756b5b17df7bddd5e7b9f97d35cea4ff8798e80c1362b7bcf2a8"} Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.194385 5047 scope.go:117] "RemoveContainer" containerID="737497cf2ac9b016880b5d7b4a2b3659fb2c6bb26357ef5ff1733b4e482559fc" Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.195870 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nglmq" podStartSLOduration=3.168139008 podStartE2EDuration="37.195851405s" podCreationTimestamp="2026-02-23 07:06:47 +0000 UTC" firstStartedPulling="2026-02-23 07:06:48.818303781 +0000 UTC m=+1331.069630915" lastFinishedPulling="2026-02-23 07:07:22.846016178 +0000 UTC m=+1365.097343312" observedRunningTime="2026-02-23 07:07:24.191197187 +0000 UTC m=+1366.442524321" watchObservedRunningTime="2026-02-23 07:07:24.195851405 +0000 UTC m=+1366.447178539" Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.224822 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-qmfq9"] Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.236484 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-qmfq9"] Feb 23 07:07:24 crc kubenswrapper[5047]: I0223 07:07:24.356658 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" path="/var/lib/kubelet/pods/c24a7665-aaa1-4ec3-bb37-2ec812f8386e/volumes" Feb 23 07:07:25 crc kubenswrapper[5047]: I0223 07:07:25.184827 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vw9rj" 
event={"ID":"4effbb70-cdb3-42e4-a8d3-e1df904f38b9","Type":"ContainerStarted","Data":"bb85a2f1ff6def18a756205ee4f5c162129c0424fc0d632f8cb5d4b7d746121f"} Feb 23 07:07:25 crc kubenswrapper[5047]: I0223 07:07:25.218014 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-vw9rj" podStartSLOduration=3.109844976 podStartE2EDuration="38.217983499s" podCreationTimestamp="2026-02-23 07:06:47 +0000 UTC" firstStartedPulling="2026-02-23 07:06:48.788951951 +0000 UTC m=+1331.040279085" lastFinishedPulling="2026-02-23 07:07:23.897090484 +0000 UTC m=+1366.148417608" observedRunningTime="2026-02-23 07:07:25.202131788 +0000 UTC m=+1367.453458922" watchObservedRunningTime="2026-02-23 07:07:25.217983499 +0000 UTC m=+1367.469310633" Feb 23 07:07:27 crc kubenswrapper[5047]: I0223 07:07:27.217027 5047 generic.go:334] "Generic (PLEG): container finished" podID="4effbb70-cdb3-42e4-a8d3-e1df904f38b9" containerID="bb85a2f1ff6def18a756205ee4f5c162129c0424fc0d632f8cb5d4b7d746121f" exitCode=0 Feb 23 07:07:27 crc kubenswrapper[5047]: I0223 07:07:27.217120 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vw9rj" event={"ID":"4effbb70-cdb3-42e4-a8d3-e1df904f38b9","Type":"ContainerDied","Data":"bb85a2f1ff6def18a756205ee4f5c162129c0424fc0d632f8cb5d4b7d746121f"} Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.243135 5047 generic.go:334] "Generic (PLEG): container finished" podID="aca947e1-09b7-404f-9162-94484bf52701" containerID="80ad6f181096756b5b17df7bddd5e7b9f97d35cea4ff8798e80c1362b7bcf2a8" exitCode=0 Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.243208 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nglmq" event={"ID":"aca947e1-09b7-404f-9162-94484bf52701","Type":"ContainerDied","Data":"80ad6f181096756b5b17df7bddd5e7b9f97d35cea4ff8798e80c1362b7bcf2a8"} Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.484324 5047 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.663734 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-db-sync-config-data\") pod \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.663888 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-combined-ca-bundle\") pod \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.664035 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg67s\" (UniqueName: \"kubernetes.io/projected/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-kube-api-access-gg67s\") pod \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\" (UID: \"4effbb70-cdb3-42e4-a8d3-e1df904f38b9\") " Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.669705 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4effbb70-cdb3-42e4-a8d3-e1df904f38b9" (UID: "4effbb70-cdb3-42e4-a8d3-e1df904f38b9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.671433 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-kube-api-access-gg67s" (OuterVolumeSpecName: "kube-api-access-gg67s") pod "4effbb70-cdb3-42e4-a8d3-e1df904f38b9" (UID: "4effbb70-cdb3-42e4-a8d3-e1df904f38b9"). 
InnerVolumeSpecName "kube-api-access-gg67s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.700186 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4effbb70-cdb3-42e4-a8d3-e1df904f38b9" (UID: "4effbb70-cdb3-42e4-a8d3-e1df904f38b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.767393 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.767452 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg67s\" (UniqueName: \"kubernetes.io/projected/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-kube-api-access-gg67s\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:29 crc kubenswrapper[5047]: I0223 07:07:29.767475 5047 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4effbb70-cdb3-42e4-a8d3-e1df904f38b9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.258491 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vw9rj" event={"ID":"4effbb70-cdb3-42e4-a8d3-e1df904f38b9","Type":"ContainerDied","Data":"73a5c6ee7f4f6bf5f16a7d851a6d70d882c52914a97d036202771f4d8ade43e5"} Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.259052 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73a5c6ee7f4f6bf5f16a7d851a6d70d882c52914a97d036202771f4d8ade43e5" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.258516 5047 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vw9rj" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.278500 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerStarted","Data":"7dbb0f2cb1fd3a1970239bd3959d74badc51668fe802c20c64485a2570565ebe"} Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.279000 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-central-agent" containerID="cri-o://303b0be2fc137b88102768998fcad85da0339cb31cd9c5b3f96d8dc5f831d866" gracePeriod=30 Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.279037 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="sg-core" containerID="cri-o://299090d986c8c6ed9b50c0a6fb790c96a09c902c1c2aea7366f06a25dbe324c0" gracePeriod=30 Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.279108 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="proxy-httpd" containerID="cri-o://7dbb0f2cb1fd3a1970239bd3959d74badc51668fe802c20c64485a2570565ebe" gracePeriod=30 Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.279012 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-notification-agent" containerID="cri-o://817654c6409668e7517ce28241e05825ddfc2915782d23b89179dd2479319550" gracePeriod=30 Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.327938 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.548865184 
podStartE2EDuration="43.327888291s" podCreationTimestamp="2026-02-23 07:06:47 +0000 UTC" firstStartedPulling="2026-02-23 07:06:48.709959381 +0000 UTC m=+1330.961286515" lastFinishedPulling="2026-02-23 07:07:29.488982498 +0000 UTC m=+1371.740309622" observedRunningTime="2026-02-23 07:07:30.31480789 +0000 UTC m=+1372.566135054" watchObservedRunningTime="2026-02-23 07:07:30.327888291 +0000 UTC m=+1372.579215435" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.778645 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-799dddc985-b669w"] Feb 23 07:07:30 crc kubenswrapper[5047]: E0223 07:07:30.779107 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerName="dnsmasq-dns" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.779121 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerName="dnsmasq-dns" Feb 23 07:07:30 crc kubenswrapper[5047]: E0223 07:07:30.779155 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerName="init" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.779161 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerName="init" Feb 23 07:07:30 crc kubenswrapper[5047]: E0223 07:07:30.779171 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4effbb70-cdb3-42e4-a8d3-e1df904f38b9" containerName="barbican-db-sync" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.779177 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4effbb70-cdb3-42e4-a8d3-e1df904f38b9" containerName="barbican-db-sync" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.779360 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4effbb70-cdb3-42e4-a8d3-e1df904f38b9" containerName="barbican-db-sync" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 
07:07:30.779381 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24a7665-aaa1-4ec3-bb37-2ec812f8386e" containerName="dnsmasq-dns" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.782055 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.784828 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nglmq" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.786189 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.786415 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.800518 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-v97qw" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.813334 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-67c5f5b45b-szhrl"] Feb 23 07:07:30 crc kubenswrapper[5047]: E0223 07:07:30.813962 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca947e1-09b7-404f-9162-94484bf52701" containerName="cinder-db-sync" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.813982 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca947e1-09b7-404f-9162-94484bf52701" containerName="cinder-db-sync" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.814272 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca947e1-09b7-404f-9162-94484bf52701" containerName="cinder-db-sync" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.815565 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.818720 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.835423 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-799dddc985-b669w"] Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.852129 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67c5f5b45b-szhrl"] Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.890861 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-5hfhv"] Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.892707 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.894377 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-config-data\") pod \"aca947e1-09b7-404f-9162-94484bf52701\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.894455 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-scripts\") pod \"aca947e1-09b7-404f-9162-94484bf52701\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.894522 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-combined-ca-bundle\") pod \"aca947e1-09b7-404f-9162-94484bf52701\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " 
Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.894556 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aca947e1-09b7-404f-9162-94484bf52701-etc-machine-id\") pod \"aca947e1-09b7-404f-9162-94484bf52701\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.894603 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99lld\" (UniqueName: \"kubernetes.io/projected/aca947e1-09b7-404f-9162-94484bf52701-kube-api-access-99lld\") pod \"aca947e1-09b7-404f-9162-94484bf52701\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.894660 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-db-sync-config-data\") pod \"aca947e1-09b7-404f-9162-94484bf52701\" (UID: \"aca947e1-09b7-404f-9162-94484bf52701\") " Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.895043 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r4sq\" (UniqueName: \"kubernetes.io/projected/8af89633-d2bc-4f80-9e1e-0eb183f11462-kube-api-access-6r4sq\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.895081 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.895134 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data-custom\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.895160 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af89633-d2bc-4f80-9e1e-0eb183f11462-logs\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.895200 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-combined-ca-bundle\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.901954 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aca947e1-09b7-404f-9162-94484bf52701-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aca947e1-09b7-404f-9162-94484bf52701" (UID: "aca947e1-09b7-404f-9162-94484bf52701"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.907513 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aca947e1-09b7-404f-9162-94484bf52701" (UID: "aca947e1-09b7-404f-9162-94484bf52701"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.912355 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-scripts" (OuterVolumeSpecName: "scripts") pod "aca947e1-09b7-404f-9162-94484bf52701" (UID: "aca947e1-09b7-404f-9162-94484bf52701"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.933444 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca947e1-09b7-404f-9162-94484bf52701-kube-api-access-99lld" (OuterVolumeSpecName: "kube-api-access-99lld") pod "aca947e1-09b7-404f-9162-94484bf52701" (UID: "aca947e1-09b7-404f-9162-94484bf52701"). InnerVolumeSpecName "kube-api-access-99lld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.933544 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-5hfhv"] Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.957507 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aca947e1-09b7-404f-9162-94484bf52701" (UID: "aca947e1-09b7-404f-9162-94484bf52701"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.990041 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-config-data" (OuterVolumeSpecName: "config-data") pod "aca947e1-09b7-404f-9162-94484bf52701" (UID: "aca947e1-09b7-404f-9162-94484bf52701"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996692 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data-custom\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996743 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8x4z\" (UniqueName: \"kubernetes.io/projected/556e6ac3-8c64-4ee2-95f2-511a07bf220b-kube-api-access-b8x4z\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996801 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-combined-ca-bundle\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996831 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r4sq\" (UniqueName: \"kubernetes.io/projected/8af89633-d2bc-4f80-9e1e-0eb183f11462-kube-api-access-6r4sq\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996864 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996885 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996926 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556e6ac3-8c64-4ee2-95f2-511a07bf220b-logs\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996952 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.996974 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qbpg\" (UniqueName: \"kubernetes.io/projected/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-kube-api-access-8qbpg\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997007 5047 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data-custom\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997042 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997062 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af89633-d2bc-4f80-9e1e-0eb183f11462-logs\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997083 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-config\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997112 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997135 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997153 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-combined-ca-bundle\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997238 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997252 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997261 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997270 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aca947e1-09b7-404f-9162-94484bf52701-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997279 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99lld\" (UniqueName: \"kubernetes.io/projected/aca947e1-09b7-404f-9162-94484bf52701-kube-api-access-99lld\") 
on node \"crc\" DevicePath \"\"" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997288 5047 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aca947e1-09b7-404f-9162-94484bf52701-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.997376 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5856459f5b-mnzgp"] Feb 23 07:07:30 crc kubenswrapper[5047]: I0223 07:07:30.998172 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af89633-d2bc-4f80-9e1e-0eb183f11462-logs\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.000666 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.003357 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-combined-ca-bundle\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.003741 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.005629 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data-custom\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.010374 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.022887 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r4sq\" (UniqueName: \"kubernetes.io/projected/8af89633-d2bc-4f80-9e1e-0eb183f11462-kube-api-access-6r4sq\") pod \"barbican-worker-799dddc985-b669w\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.024710 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5856459f5b-mnzgp"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.099785 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg5lz\" (UniqueName: \"kubernetes.io/projected/b602e57c-15c3-42cc-b37a-740047deb948-kube-api-access-cg5lz\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.100850 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-combined-ca-bundle\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101012 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101179 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556e6ac3-8c64-4ee2-95f2-511a07bf220b-logs\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101370 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101527 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-combined-ca-bundle\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101638 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qbpg\" (UniqueName: \"kubernetes.io/projected/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-kube-api-access-8qbpg\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101763 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101871 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-config\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.103030 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.103169 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.103299 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b602e57c-15c3-42cc-b37a-740047deb948-logs\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.103420 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data-custom\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.103517 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.104576 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.101657 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556e6ac3-8c64-4ee2-95f2-511a07bf220b-logs\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.102748 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.102961 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-config\") pod 
\"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.104646 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-combined-ca-bundle\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.102051 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.104468 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.105125 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data-custom\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.105194 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8x4z\" (UniqueName: 
\"kubernetes.io/projected/556e6ac3-8c64-4ee2-95f2-511a07bf220b-kube-api-access-b8x4z\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.105492 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.109241 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data-custom\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.123026 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qbpg\" (UniqueName: \"kubernetes.io/projected/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-kube-api-access-8qbpg\") pod \"dnsmasq-dns-9d49dd75f-5hfhv\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.126743 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8x4z\" (UniqueName: \"kubernetes.io/projected/556e6ac3-8c64-4ee2-95f2-511a07bf220b-kube-api-access-b8x4z\") pod \"barbican-keystone-listener-67c5f5b45b-szhrl\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.162640 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.172215 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.207113 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-combined-ca-bundle\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.207207 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b602e57c-15c3-42cc-b37a-740047deb948-logs\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.207238 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data-custom\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.207255 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.207308 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg5lz\" 
(UniqueName: \"kubernetes.io/projected/b602e57c-15c3-42cc-b37a-740047deb948-kube-api-access-cg5lz\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.208979 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b602e57c-15c3-42cc-b37a-740047deb948-logs\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.212453 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-combined-ca-bundle\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.214070 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.215106 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data-custom\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.230171 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg5lz\" (UniqueName: 
\"kubernetes.io/projected/b602e57c-15c3-42cc-b37a-740047deb948-kube-api-access-cg5lz\") pod \"barbican-api-5856459f5b-mnzgp\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.233250 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.319824 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nglmq" event={"ID":"aca947e1-09b7-404f-9162-94484bf52701","Type":"ContainerDied","Data":"329a6c5c160bc4ade2a05131c1a0ed9ecae2ee7191b5cb346289e3d67c0c3d49"} Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.319872 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="329a6c5c160bc4ade2a05131c1a0ed9ecae2ee7191b5cb346289e3d67c0c3d49" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.319969 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nglmq" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.325287 5047 generic.go:334] "Generic (PLEG): container finished" podID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerID="7dbb0f2cb1fd3a1970239bd3959d74badc51668fe802c20c64485a2570565ebe" exitCode=0 Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.325323 5047 generic.go:334] "Generic (PLEG): container finished" podID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerID="299090d986c8c6ed9b50c0a6fb790c96a09c902c1c2aea7366f06a25dbe324c0" exitCode=2 Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.325335 5047 generic.go:334] "Generic (PLEG): container finished" podID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerID="303b0be2fc137b88102768998fcad85da0339cb31cd9c5b3f96d8dc5f831d866" exitCode=0 Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.325364 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerDied","Data":"7dbb0f2cb1fd3a1970239bd3959d74badc51668fe802c20c64485a2570565ebe"} Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.325430 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerDied","Data":"299090d986c8c6ed9b50c0a6fb790c96a09c902c1c2aea7366f06a25dbe324c0"} Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.325445 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerDied","Data":"303b0be2fc137b88102768998fcad85da0339cb31cd9c5b3f96d8dc5f831d866"} Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.336667 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.500353 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.502175 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.508339 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.513113 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.513434 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.519117 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4k8zt" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.538027 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.586476 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-5hfhv"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.616502 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.618682 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.625447 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.625629 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg8r2\" (UniqueName: \"kubernetes.io/projected/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-kube-api-access-bg8r2\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.625693 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.625753 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.625791 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " 
pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.625817 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.666492 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733252 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733312 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733359 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733381 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77ljk\" (UniqueName: 
\"kubernetes.io/projected/c53524c5-f61d-4400-b15e-75d3a13e8297-kube-api-access-77ljk\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733400 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733421 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg8r2\" (UniqueName: \"kubernetes.io/projected/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-kube-api-access-bg8r2\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733477 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733524 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733559 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-config\") 
pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733589 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733624 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.733643 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.735597 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.743060 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 
07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.746120 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.746161 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.746968 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.760106 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg8r2\" (UniqueName: \"kubernetes.io/projected/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-kube-api-access-bg8r2\") pod \"cinder-scheduler-0\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.793993 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-799dddc985-b669w"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.843015 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.843663 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.843740 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.844021 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.844130 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.844249 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 
07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.844281 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77ljk\" (UniqueName: \"kubernetes.io/projected/c53524c5-f61d-4400-b15e-75d3a13e8297-kube-api-access-77ljk\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.857861 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.859828 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.859867 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.860209 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.889143 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.896845 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.901692 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.906186 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.912993 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67c5f5b45b-szhrl"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.923501 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77ljk\" (UniqueName: \"kubernetes.io/projected/c53524c5-f61d-4400-b15e-75d3a13e8297-kube-api-access-77ljk\") pod \"dnsmasq-dns-6c8dc7b4d9-xwjw2\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.929586 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:31 crc kubenswrapper[5047]: I0223 07:07:31.961368 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.012008 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-5hfhv"] Feb 23 07:07:32 crc kubenswrapper[5047]: W0223 07:07:32.012163 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod579a85eb_082b_4fc4_abf3_a4c11fdacf0f.slice/crio-53997038c6f1af699a1f266476b6068ad9ed9d2a6b351e39483eb48332cbc15f WatchSource:0}: Error finding container 53997038c6f1af699a1f266476b6068ad9ed9d2a6b351e39483eb48332cbc15f: Status 404 returned error can't find the container with id 53997038c6f1af699a1f266476b6068ad9ed9d2a6b351e39483eb48332cbc15f Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.060751 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.060802 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.060838 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-logs\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.060870 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-scripts\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.060914 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.060938 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data-custom\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.060964 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ttjc\" (UniqueName: \"kubernetes.io/projected/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-kube-api-access-4ttjc\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.154374 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5856459f5b-mnzgp"] Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.163917 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data-custom\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.164253 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ttjc\" (UniqueName: \"kubernetes.io/projected/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-kube-api-access-4ttjc\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.164762 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.164794 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.164899 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-logs\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.164959 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-scripts\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.165023 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data\") pod \"cinder-api-0\" (UID: 
\"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.165288 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.166374 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-logs\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: W0223 07:07:32.169313 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb602e57c_15c3_42cc_b37a_740047deb948.slice/crio-b9ff29c385a088cd67807ff7cf9447cc7236b3ff0917f6b7b84ff973042bfa02 WatchSource:0}: Error finding container b9ff29c385a088cd67807ff7cf9447cc7236b3ff0917f6b7b84ff973042bfa02: Status 404 returned error can't find the container with id b9ff29c385a088cd67807ff7cf9447cc7236b3ff0917f6b7b84ff973042bfa02 Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.171243 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.174614 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-scripts\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc 
kubenswrapper[5047]: I0223 07:07:32.175996 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data-custom\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.179794 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.194073 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ttjc\" (UniqueName: \"kubernetes.io/projected/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-kube-api-access-4ttjc\") pod \"cinder-api-0\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.266319 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.393987 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.395625 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856459f5b-mnzgp" event={"ID":"b602e57c-15c3-42cc-b37a-740047deb948","Type":"ContainerStarted","Data":"b9ff29c385a088cd67807ff7cf9447cc7236b3ff0917f6b7b84ff973042bfa02"} Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.403473 5047 generic.go:334] "Generic (PLEG): container finished" podID="579a85eb-082b-4fc4-abf3-a4c11fdacf0f" containerID="b9f10d2f1dd9b29560bf5d01af0904d706883905051a4612e772c40391fcfe8b" exitCode=0 Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.403529 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" event={"ID":"579a85eb-082b-4fc4-abf3-a4c11fdacf0f","Type":"ContainerDied","Data":"b9f10d2f1dd9b29560bf5d01af0904d706883905051a4612e772c40391fcfe8b"} Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.403830 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" event={"ID":"579a85eb-082b-4fc4-abf3-a4c11fdacf0f","Type":"ContainerStarted","Data":"53997038c6f1af699a1f266476b6068ad9ed9d2a6b351e39483eb48332cbc15f"} Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.406288 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-799dddc985-b669w" event={"ID":"8af89633-d2bc-4f80-9e1e-0eb183f11462","Type":"ContainerStarted","Data":"00e565f531572fa03633468e40118a0379a2309d335c1660f9e708811cfb3222"} Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.411924 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" 
event={"ID":"556e6ac3-8c64-4ee2-95f2-511a07bf220b","Type":"ContainerStarted","Data":"1b97eb217f1c10d1c7abd316a031554173458d0214ab00813822aaa1492304ec"} Feb 23 07:07:32 crc kubenswrapper[5047]: W0223 07:07:32.419935 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3495c7bc_8bda_4e3e_bb7e_b147b38426cc.slice/crio-7e24c487dac0f31ebfa1a01de0e94c781fa3bc8fa4506c0265942195d999c60a WatchSource:0}: Error finding container 7e24c487dac0f31ebfa1a01de0e94c781fa3bc8fa4506c0265942195d999c60a: Status 404 returned error can't find the container with id 7e24c487dac0f31ebfa1a01de0e94c781fa3bc8fa4506c0265942195d999c60a Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.528574 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2"] Feb 23 07:07:32 crc kubenswrapper[5047]: W0223 07:07:32.529383 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc53524c5_f61d_4400_b15e_75d3a13e8297.slice/crio-e6911942035096026ad06d77ea8f79879d181b8f2ce45ce50563caaa696c48ae WatchSource:0}: Error finding container e6911942035096026ad06d77ea8f79879d181b8f2ce45ce50563caaa696c48ae: Status 404 returned error can't find the container with id e6911942035096026ad06d77ea8f79879d181b8f2ce45ce50563caaa696c48ae Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.944235 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:32 crc kubenswrapper[5047]: I0223 07:07:32.945517 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:33 crc kubenswrapper[5047]: W0223 07:07:33.056999 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod369ee1fc_d1da_47e9_b1e7_04a114c1e5df.slice/crio-8c654d51cc7830e0208b5a0c09c7ad45129bb3f596475b43a7a9365d251d9784 WatchSource:0}: Error finding container 8c654d51cc7830e0208b5a0c09c7ad45129bb3f596475b43a7a9365d251d9784: Status 404 returned error can't find the container with id 8c654d51cc7830e0208b5a0c09c7ad45129bb3f596475b43a7a9365d251d9784 Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.096239 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-nb\") pod \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.096767 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-swift-storage-0\") pod \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.096943 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-sb\") pod \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.097038 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-svc\") pod \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\" (UID: 
\"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.097087 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qbpg\" (UniqueName: \"kubernetes.io/projected/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-kube-api-access-8qbpg\") pod \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.097277 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-config\") pod \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\" (UID: \"579a85eb-082b-4fc4-abf3-a4c11fdacf0f\") " Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.103787 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-kube-api-access-8qbpg" (OuterVolumeSpecName: "kube-api-access-8qbpg") pod "579a85eb-082b-4fc4-abf3-a4c11fdacf0f" (UID: "579a85eb-082b-4fc4-abf3-a4c11fdacf0f"). InnerVolumeSpecName "kube-api-access-8qbpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.125015 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-config" (OuterVolumeSpecName: "config") pod "579a85eb-082b-4fc4-abf3-a4c11fdacf0f" (UID: "579a85eb-082b-4fc4-abf3-a4c11fdacf0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.127632 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "579a85eb-082b-4fc4-abf3-a4c11fdacf0f" (UID: "579a85eb-082b-4fc4-abf3-a4c11fdacf0f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.127656 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "579a85eb-082b-4fc4-abf3-a4c11fdacf0f" (UID: "579a85eb-082b-4fc4-abf3-a4c11fdacf0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.134660 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "579a85eb-082b-4fc4-abf3-a4c11fdacf0f" (UID: "579a85eb-082b-4fc4-abf3-a4c11fdacf0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.168449 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "579a85eb-082b-4fc4-abf3-a4c11fdacf0f" (UID: "579a85eb-082b-4fc4-abf3-a4c11fdacf0f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.201301 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.201346 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.201360 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.201369 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.201380 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.201389 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qbpg\" (UniqueName: \"kubernetes.io/projected/579a85eb-082b-4fc4-abf3-a4c11fdacf0f-kube-api-access-8qbpg\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.441400 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3495c7bc-8bda-4e3e-bb7e-b147b38426cc","Type":"ContainerStarted","Data":"7e24c487dac0f31ebfa1a01de0e94c781fa3bc8fa4506c0265942195d999c60a"} Feb 23 07:07:33 crc 
kubenswrapper[5047]: I0223 07:07:33.444887 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856459f5b-mnzgp" event={"ID":"b602e57c-15c3-42cc-b37a-740047deb948","Type":"ContainerStarted","Data":"ac57ae972b740c745ab6d81111292382cca65cd485b190d599b552aa597e419a"} Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.444941 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856459f5b-mnzgp" event={"ID":"b602e57c-15c3-42cc-b37a-740047deb948","Type":"ContainerStarted","Data":"8dd78583fe6bc8b5e349ae87651f76072d80a628b66dd462ab8922f0b45f99fc"} Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.445157 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.445604 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.448369 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" event={"ID":"579a85eb-082b-4fc4-abf3-a4c11fdacf0f","Type":"ContainerDied","Data":"53997038c6f1af699a1f266476b6068ad9ed9d2a6b351e39483eb48332cbc15f"} Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.448417 5047 scope.go:117] "RemoveContainer" containerID="b9f10d2f1dd9b29560bf5d01af0904d706883905051a4612e772c40391fcfe8b" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.448530 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-5hfhv" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.455640 5047 generic.go:334] "Generic (PLEG): container finished" podID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerID="817654c6409668e7517ce28241e05825ddfc2915782d23b89179dd2479319550" exitCode=0 Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.455797 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerDied","Data":"817654c6409668e7517ce28241e05825ddfc2915782d23b89179dd2479319550"} Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.457241 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"369ee1fc-d1da-47e9-b1e7-04a114c1e5df","Type":"ContainerStarted","Data":"8c654d51cc7830e0208b5a0c09c7ad45129bb3f596475b43a7a9365d251d9784"} Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.458772 5047 generic.go:334] "Generic (PLEG): container finished" podID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerID="ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032" exitCode=0 Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.458809 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" event={"ID":"c53524c5-f61d-4400-b15e-75d3a13e8297","Type":"ContainerDied","Data":"ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032"} Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.458827 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" event={"ID":"c53524c5-f61d-4400-b15e-75d3a13e8297","Type":"ContainerStarted","Data":"e6911942035096026ad06d77ea8f79879d181b8f2ce45ce50563caaa696c48ae"} Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.472272 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5856459f5b-mnzgp" 
podStartSLOduration=3.472248573 podStartE2EDuration="3.472248573s" podCreationTimestamp="2026-02-23 07:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:33.470873159 +0000 UTC m=+1375.722200293" watchObservedRunningTime="2026-02-23 07:07:33.472248573 +0000 UTC m=+1375.723575727" Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.563697 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-5hfhv"] Feb 23 07:07:33 crc kubenswrapper[5047]: I0223 07:07:33.572148 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-5hfhv"] Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.267117 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.394309 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579a85eb-082b-4fc4-abf3-a4c11fdacf0f" path="/var/lib/kubelet/pods/579a85eb-082b-4fc4-abf3-a4c11fdacf0f/volumes" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.429333 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-log-httpd\") pod \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.429544 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8g9j\" (UniqueName: \"kubernetes.io/projected/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-kube-api-access-r8g9j\") pod \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.429682 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-run-httpd\") pod \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.429752 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-config-data\") pod \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.430704 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-combined-ca-bundle\") pod \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.430787 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-scripts\") pod \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.430835 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-sg-core-conf-yaml\") pod \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\" (UID: \"8e70d776-7f7c-41ee-b60d-5a924bd30b7e\") " Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.433270 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8e70d776-7f7c-41ee-b60d-5a924bd30b7e" (UID: "8e70d776-7f7c-41ee-b60d-5a924bd30b7e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.435164 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8e70d776-7f7c-41ee-b60d-5a924bd30b7e" (UID: "8e70d776-7f7c-41ee-b60d-5a924bd30b7e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.441348 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-kube-api-access-r8g9j" (OuterVolumeSpecName: "kube-api-access-r8g9j") pod "8e70d776-7f7c-41ee-b60d-5a924bd30b7e" (UID: "8e70d776-7f7c-41ee-b60d-5a924bd30b7e"). InnerVolumeSpecName "kube-api-access-r8g9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.441442 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-scripts" (OuterVolumeSpecName: "scripts") pod "8e70d776-7f7c-41ee-b60d-5a924bd30b7e" (UID: "8e70d776-7f7c-41ee-b60d-5a924bd30b7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.482719 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8e70d776-7f7c-41ee-b60d-5a924bd30b7e" (UID: "8e70d776-7f7c-41ee-b60d-5a924bd30b7e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.503635 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" event={"ID":"556e6ac3-8c64-4ee2-95f2-511a07bf220b","Type":"ContainerStarted","Data":"aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e"} Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.510669 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-799dddc985-b669w" event={"ID":"8af89633-d2bc-4f80-9e1e-0eb183f11462","Type":"ContainerStarted","Data":"4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e"} Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.520338 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70d776-7f7c-41ee-b60d-5a924bd30b7e","Type":"ContainerDied","Data":"2189e74f1c16851895c6382ed06b5927e6abb93a6242832359b383a9615013c8"} Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.520383 5047 scope.go:117] "RemoveContainer" containerID="7dbb0f2cb1fd3a1970239bd3959d74badc51668fe802c20c64485a2570565ebe" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.520436 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.529158 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" event={"ID":"c53524c5-f61d-4400-b15e-75d3a13e8297","Type":"ContainerStarted","Data":"3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf"} Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.529403 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.533788 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.533815 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.533825 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.533836 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8g9j\" (UniqueName: \"kubernetes.io/projected/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-kube-api-access-r8g9j\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.533847 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.555976 5047 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" podStartSLOduration=3.555954914 podStartE2EDuration="3.555954914s" podCreationTimestamp="2026-02-23 07:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:34.550624559 +0000 UTC m=+1376.801951693" watchObservedRunningTime="2026-02-23 07:07:34.555954914 +0000 UTC m=+1376.807282048" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.560211 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e70d776-7f7c-41ee-b60d-5a924bd30b7e" (UID: "8e70d776-7f7c-41ee-b60d-5a924bd30b7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.593716 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-config-data" (OuterVolumeSpecName: "config-data") pod "8e70d776-7f7c-41ee-b60d-5a924bd30b7e" (UID: "8e70d776-7f7c-41ee-b60d-5a924bd30b7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.636129 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.636618 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70d776-7f7c-41ee-b60d-5a924bd30b7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.655991 5047 scope.go:117] "RemoveContainer" containerID="299090d986c8c6ed9b50c0a6fb790c96a09c902c1c2aea7366f06a25dbe324c0" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.692457 5047 scope.go:117] "RemoveContainer" containerID="817654c6409668e7517ce28241e05825ddfc2915782d23b89179dd2479319550" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.739024 5047 scope.go:117] "RemoveContainer" containerID="303b0be2fc137b88102768998fcad85da0339cb31cd9c5b3f96d8dc5f831d866" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.881687 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.890650 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.917039 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:34 crc kubenswrapper[5047]: E0223 07:07:34.917661 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="sg-core" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.917683 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="sg-core" Feb 23 07:07:34 crc kubenswrapper[5047]: E0223 
07:07:34.917700 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="proxy-httpd" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.917711 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="proxy-httpd" Feb 23 07:07:34 crc kubenswrapper[5047]: E0223 07:07:34.917727 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-notification-agent" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.917733 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-notification-agent" Feb 23 07:07:34 crc kubenswrapper[5047]: E0223 07:07:34.917748 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-central-agent" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.917755 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-central-agent" Feb 23 07:07:34 crc kubenswrapper[5047]: E0223 07:07:34.917788 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579a85eb-082b-4fc4-abf3-a4c11fdacf0f" containerName="init" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.917796 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="579a85eb-082b-4fc4-abf3-a4c11fdacf0f" containerName="init" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.917990 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="579a85eb-082b-4fc4-abf3-a4c11fdacf0f" containerName="init" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.918013 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-notification-agent" Feb 23 07:07:34 crc 
kubenswrapper[5047]: I0223 07:07:34.918028 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="sg-core" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.918035 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="proxy-httpd" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.918047 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" containerName="ceilometer-central-agent" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.920106 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.923948 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.924013 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:07:34 crc kubenswrapper[5047]: I0223 07:07:34.937045 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.045243 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.045584 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-config-data\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc 
kubenswrapper[5047]: I0223 07:07:35.045671 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.045791 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.045864 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd5zj\" (UniqueName: \"kubernetes.io/projected/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-kube-api-access-hd5zj\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.045961 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.046094 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-scripts\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.092540 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 
07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.148229 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.148325 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-scripts\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.148369 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.148428 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-config-data\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.148451 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.148478 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-run-httpd\") 
pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.148497 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd5zj\" (UniqueName: \"kubernetes.io/projected/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-kube-api-access-hd5zj\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.149356 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.149474 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.158774 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.159128 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-config-data\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.159230 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.161845 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-scripts\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.163610 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd5zj\" (UniqueName: \"kubernetes.io/projected/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-kube-api-access-hd5zj\") pod \"ceilometer-0\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.240599 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.565080 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" event={"ID":"556e6ac3-8c64-4ee2-95f2-511a07bf220b","Type":"ContainerStarted","Data":"d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786"} Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.570743 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3495c7bc-8bda-4e3e-bb7e-b147b38426cc","Type":"ContainerStarted","Data":"f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee"} Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.575663 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-799dddc985-b669w" event={"ID":"8af89633-d2bc-4f80-9e1e-0eb183f11462","Type":"ContainerStarted","Data":"9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe"} Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.603396 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"369ee1fc-d1da-47e9-b1e7-04a114c1e5df","Type":"ContainerStarted","Data":"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff"} Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.625073 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" podStartSLOduration=3.491637154 podStartE2EDuration="5.625048425s" podCreationTimestamp="2026-02-23 07:07:30 +0000 UTC" firstStartedPulling="2026-02-23 07:07:31.83647376 +0000 UTC m=+1374.087800894" lastFinishedPulling="2026-02-23 07:07:33.969885031 +0000 UTC m=+1376.221212165" observedRunningTime="2026-02-23 07:07:35.593467297 +0000 UTC m=+1377.844794431" watchObservedRunningTime="2026-02-23 07:07:35.625048425 +0000 UTC m=+1377.876375559" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 
07:07:35.638415 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-799dddc985-b669w" podStartSLOduration=3.464771054 podStartE2EDuration="5.638385262s" podCreationTimestamp="2026-02-23 07:07:30 +0000 UTC" firstStartedPulling="2026-02-23 07:07:31.815090429 +0000 UTC m=+1374.066417563" lastFinishedPulling="2026-02-23 07:07:33.988704637 +0000 UTC m=+1376.240031771" observedRunningTime="2026-02-23 07:07:35.614322154 +0000 UTC m=+1377.865649308" watchObservedRunningTime="2026-02-23 07:07:35.638385262 +0000 UTC m=+1377.889712396" Feb 23 07:07:35 crc kubenswrapper[5047]: I0223 07:07:35.836388 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.354240 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e70d776-7f7c-41ee-b60d-5a924bd30b7e" path="/var/lib/kubelet/pods/8e70d776-7f7c-41ee-b60d-5a924bd30b7e/volumes" Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.650007 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"369ee1fc-d1da-47e9-b1e7-04a114c1e5df","Type":"ContainerStarted","Data":"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1"} Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.650122 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerName="cinder-api-log" containerID="cri-o://60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff" gracePeriod=30 Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.650229 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.650261 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" 
containerName="cinder-api" containerID="cri-o://5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1" gracePeriod=30 Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.656059 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3495c7bc-8bda-4e3e-bb7e-b147b38426cc","Type":"ContainerStarted","Data":"bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d"} Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.661716 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerStarted","Data":"bb7be07bd007c584bd7b01df32314aaf8c744c39fd8e80c73ff9040f3626f3e9"} Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.675712 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.67568847 podStartE2EDuration="5.67568847s" podCreationTimestamp="2026-02-23 07:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:36.674938021 +0000 UTC m=+1378.926265155" watchObservedRunningTime="2026-02-23 07:07:36.67568847 +0000 UTC m=+1378.927015604" Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.700151 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.138964869 podStartE2EDuration="5.700127258s" podCreationTimestamp="2026-02-23 07:07:31 +0000 UTC" firstStartedPulling="2026-02-23 07:07:32.426692637 +0000 UTC m=+1374.678019771" lastFinishedPulling="2026-02-23 07:07:33.987855026 +0000 UTC m=+1376.239182160" observedRunningTime="2026-02-23 07:07:36.694767712 +0000 UTC m=+1378.946094866" watchObservedRunningTime="2026-02-23 07:07:36.700127258 +0000 UTC m=+1378.951454392" Feb 23 07:07:36 crc kubenswrapper[5047]: I0223 07:07:36.834988 5047 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.218567 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.304163 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-etc-machine-id\") pod \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.304307 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data-custom\") pod \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.304359 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-scripts\") pod \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.304395 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ttjc\" (UniqueName: \"kubernetes.io/projected/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-kube-api-access-4ttjc\") pod \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.304454 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data\") pod \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\" (UID: 
\"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.304501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-combined-ca-bundle\") pod \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.304580 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-logs\") pod \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\" (UID: \"369ee1fc-d1da-47e9-b1e7-04a114c1e5df\") " Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.305709 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-logs" (OuterVolumeSpecName: "logs") pod "369ee1fc-d1da-47e9-b1e7-04a114c1e5df" (UID: "369ee1fc-d1da-47e9-b1e7-04a114c1e5df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.305817 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "369ee1fc-d1da-47e9-b1e7-04a114c1e5df" (UID: "369ee1fc-d1da-47e9-b1e7-04a114c1e5df"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.312764 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "369ee1fc-d1da-47e9-b1e7-04a114c1e5df" (UID: "369ee1fc-d1da-47e9-b1e7-04a114c1e5df"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.312802 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-kube-api-access-4ttjc" (OuterVolumeSpecName: "kube-api-access-4ttjc") pod "369ee1fc-d1da-47e9-b1e7-04a114c1e5df" (UID: "369ee1fc-d1da-47e9-b1e7-04a114c1e5df"). InnerVolumeSpecName "kube-api-access-4ttjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.314307 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-scripts" (OuterVolumeSpecName: "scripts") pod "369ee1fc-d1da-47e9-b1e7-04a114c1e5df" (UID: "369ee1fc-d1da-47e9-b1e7-04a114c1e5df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.361642 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "369ee1fc-d1da-47e9-b1e7-04a114c1e5df" (UID: "369ee1fc-d1da-47e9-b1e7-04a114c1e5df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.383052 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data" (OuterVolumeSpecName: "config-data") pod "369ee1fc-d1da-47e9-b1e7-04a114c1e5df" (UID: "369ee1fc-d1da-47e9-b1e7-04a114c1e5df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.407105 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.407147 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.407157 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ttjc\" (UniqueName: \"kubernetes.io/projected/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-kube-api-access-4ttjc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.407170 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.407179 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.407189 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.407199 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/369ee1fc-d1da-47e9-b1e7-04a114c1e5df-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.674466 5047 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerStarted","Data":"8d83ede9036b9fd8c2e334d730c2786748c4c52e7f0242494531c434bc3ec5be"} Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.676525 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerStarted","Data":"b16634f8ca7b2464266a2169de1e2b10c81111ece98874a629f7e7c4dbd657c8"} Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.682098 5047 generic.go:334] "Generic (PLEG): container finished" podID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerID="5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1" exitCode=0 Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.682155 5047 generic.go:334] "Generic (PLEG): container finished" podID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerID="60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff" exitCode=143 Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.683464 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.685047 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"369ee1fc-d1da-47e9-b1e7-04a114c1e5df","Type":"ContainerDied","Data":"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1"} Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.685101 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"369ee1fc-d1da-47e9-b1e7-04a114c1e5df","Type":"ContainerDied","Data":"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff"} Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.685111 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"369ee1fc-d1da-47e9-b1e7-04a114c1e5df","Type":"ContainerDied","Data":"8c654d51cc7830e0208b5a0c09c7ad45129bb3f596475b43a7a9365d251d9784"} Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.685130 5047 scope.go:117] "RemoveContainer" containerID="5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.753540 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.757930 5047 scope.go:117] "RemoveContainer" containerID="60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.764060 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.781976 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:37 crc kubenswrapper[5047]: E0223 07:07:37.782444 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerName="cinder-api" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 
07:07:37.782478 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerName="cinder-api" Feb 23 07:07:37 crc kubenswrapper[5047]: E0223 07:07:37.782503 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerName="cinder-api-log" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.782510 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerName="cinder-api-log" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.782716 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerName="cinder-api-log" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.782754 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" containerName="cinder-api" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.783775 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.823513 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.823667 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.823821 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.826762 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.854258 5047 scope.go:117] "RemoveContainer" containerID="5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1" Feb 23 07:07:37 crc kubenswrapper[5047]: E0223 07:07:37.861133 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1\": container with ID starting with 5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1 not found: ID does not exist" containerID="5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.861199 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1"} err="failed to get container status \"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1\": rpc error: code = NotFound desc = could not find container \"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1\": container with ID starting with 5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1 not found: ID does not exist" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 
07:07:37.861234 5047 scope.go:117] "RemoveContainer" containerID="60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff" Feb 23 07:07:37 crc kubenswrapper[5047]: E0223 07:07:37.866813 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff\": container with ID starting with 60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff not found: ID does not exist" containerID="60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.866839 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff"} err="failed to get container status \"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff\": rpc error: code = NotFound desc = could not find container \"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff\": container with ID starting with 60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff not found: ID does not exist" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.866861 5047 scope.go:117] "RemoveContainer" containerID="5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.873627 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1"} err="failed to get container status \"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1\": rpc error: code = NotFound desc = could not find container \"5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1\": container with ID starting with 5ec2f67c30d345bef385936945adb80f82759e54405f57761f11a10ec47637e1 not found: ID does not exist" Feb 23 07:07:37 crc 
kubenswrapper[5047]: I0223 07:07:37.873663 5047 scope.go:117] "RemoveContainer" containerID="60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.874601 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff"} err="failed to get container status \"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff\": rpc error: code = NotFound desc = could not find container \"60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff\": container with ID starting with 60a06988ee1570643d84b33a6c04b9982f88d46fd02d482f3b9ae5f80b98f7ff not found: ID does not exist" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.927317 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.927655 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-logs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.927771 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.928073 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.928179 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-scripts\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.928239 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.928309 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.928712 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8txjc\" (UniqueName: \"kubernetes.io/projected/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-kube-api-access-8txjc\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:37 crc kubenswrapper[5047]: I0223 07:07:37.928848 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030501 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030557 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030585 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-scripts\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030603 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030630 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 
07:07:38.030698 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8txjc\" (UniqueName: \"kubernetes.io/projected/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-kube-api-access-8txjc\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030733 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030762 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.030795 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-logs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.031553 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-logs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.032100 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.037246 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-scripts\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.038251 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.038548 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.039584 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.040180 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.043947 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.062612 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8txjc\" (UniqueName: \"kubernetes.io/projected/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-kube-api-access-8txjc\") pod \"cinder-api-0\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.148353 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.392455 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369ee1fc-d1da-47e9-b1e7-04a114c1e5df" path="/var/lib/kubelet/pods/369ee1fc-d1da-47e9-b1e7-04a114c1e5df/volumes" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.464416 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5cb54ddf68-t6k79"] Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.466447 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.473688 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.473945 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.542921 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data-custom\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.542978 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.543030 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-internal-tls-certs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.543075 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-combined-ca-bundle\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: 
\"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.543119 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27668b66-4868-448a-b2dd-e270ed4bc677-logs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.543173 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrbb\" (UniqueName: \"kubernetes.io/projected/27668b66-4868-448a-b2dd-e270ed4bc677-kube-api-access-bmrbb\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.543221 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-public-tls-certs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.547628 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cb54ddf68-t6k79"] Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.649791 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-public-tls-certs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.649958 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data-custom\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.649980 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.650084 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-internal-tls-certs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.650174 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-combined-ca-bundle\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.650325 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27668b66-4868-448a-b2dd-e270ed4bc677-logs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.650565 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrbb\" (UniqueName: 
\"kubernetes.io/projected/27668b66-4868-448a-b2dd-e270ed4bc677-kube-api-access-bmrbb\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.653006 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27668b66-4868-448a-b2dd-e270ed4bc677-logs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.655749 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.656939 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data-custom\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.657609 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.658168 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-combined-ca-bundle\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.667238 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-internal-tls-certs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: 
\"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.671356 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-public-tls-certs\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.672348 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrbb\" (UniqueName: \"kubernetes.io/projected/27668b66-4868-448a-b2dd-e270ed4bc677-kube-api-access-bmrbb\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.675293 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data\") pod \"barbican-api-5cb54ddf68-t6k79\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.720923 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:38 crc kubenswrapper[5047]: I0223 07:07:38.818589 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:39 crc kubenswrapper[5047]: I0223 07:07:39.400903 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cb54ddf68-t6k79"] Feb 23 07:07:39 crc kubenswrapper[5047]: W0223 07:07:39.422163 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27668b66_4868_448a_b2dd_e270ed4bc677.slice/crio-b4cbc66300dbd11959b05e9470d5418d9d599f15cb883c82d6ae0561b9afe677 WatchSource:0}: Error finding container b4cbc66300dbd11959b05e9470d5418d9d599f15cb883c82d6ae0561b9afe677: Status 404 returned error can't find the container with id b4cbc66300dbd11959b05e9470d5418d9d599f15cb883c82d6ae0561b9afe677 Feb 23 07:07:39 crc kubenswrapper[5047]: I0223 07:07:39.749284 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerStarted","Data":"1710a1afaf8b8d1863ad4a91900f4520a33242333b0ea1818cb9d87e03a7c7e5"} Feb 23 07:07:39 crc kubenswrapper[5047]: I0223 07:07:39.752276 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cb54ddf68-t6k79" event={"ID":"27668b66-4868-448a-b2dd-e270ed4bc677","Type":"ContainerStarted","Data":"41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854"} Feb 23 07:07:39 crc kubenswrapper[5047]: I0223 07:07:39.752333 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cb54ddf68-t6k79" event={"ID":"27668b66-4868-448a-b2dd-e270ed4bc677","Type":"ContainerStarted","Data":"b4cbc66300dbd11959b05e9470d5418d9d599f15cb883c82d6ae0561b9afe677"} Feb 23 07:07:39 crc kubenswrapper[5047]: I0223 07:07:39.754340 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"eb737ca7-c18f-4ff9-9285-bb35ee17cd05","Type":"ContainerStarted","Data":"8fc7018d470fbeb3c101b59cebfc490bd0f1cc9cb98ca2bbc51f91d35a503470"} Feb 23 07:07:39 crc kubenswrapper[5047]: I0223 07:07:39.754393 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb737ca7-c18f-4ff9-9285-bb35ee17cd05","Type":"ContainerStarted","Data":"e871d47dd294c1855adad5a4275d3faa21d845644bd94d8d7dea5de4a9f81d8f"} Feb 23 07:07:40 crc kubenswrapper[5047]: I0223 07:07:40.765087 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cb54ddf68-t6k79" event={"ID":"27668b66-4868-448a-b2dd-e270ed4bc677","Type":"ContainerStarted","Data":"71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd"} Feb 23 07:07:40 crc kubenswrapper[5047]: I0223 07:07:40.766224 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:40 crc kubenswrapper[5047]: I0223 07:07:40.766245 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:40 crc kubenswrapper[5047]: I0223 07:07:40.771466 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb737ca7-c18f-4ff9-9285-bb35ee17cd05","Type":"ContainerStarted","Data":"28fb45634504bebe907f3061a04e53baf4e24bef8b044220704d400b3404f735"} Feb 23 07:07:40 crc kubenswrapper[5047]: I0223 07:07:40.772383 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 07:07:40 crc kubenswrapper[5047]: I0223 07:07:40.801021 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5cb54ddf68-t6k79" podStartSLOduration=2.800826202 podStartE2EDuration="2.800826202s" podCreationTimestamp="2026-02-23 07:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 07:07:40.789782753 +0000 UTC m=+1383.041109897" watchObservedRunningTime="2026-02-23 07:07:40.800826202 +0000 UTC m=+1383.052153346" Feb 23 07:07:40 crc kubenswrapper[5047]: I0223 07:07:40.826168 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.826145982 podStartE2EDuration="3.826145982s" podCreationTimestamp="2026-02-23 07:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:40.806791293 +0000 UTC m=+1383.058118427" watchObservedRunningTime="2026-02-23 07:07:40.826145982 +0000 UTC m=+1383.077473116" Feb 23 07:07:41 crc kubenswrapper[5047]: I0223 07:07:41.964264 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.049899 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-b2csx"] Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.050725 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerName="dnsmasq-dns" containerID="cri-o://992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c" gracePeriod=10 Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.086246 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.231925 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.771178 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.807635 5047 generic.go:334] "Generic (PLEG): container finished" podID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerID="992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c" exitCode=0 Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.807905 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="cinder-scheduler" containerID="cri-o://f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee" gracePeriod=30 Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.807927 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" event={"ID":"bdaf4744-839c-4cad-aae8-c0b7396d9915","Type":"ContainerDied","Data":"992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c"} Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.808020 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" event={"ID":"bdaf4744-839c-4cad-aae8-c0b7396d9915","Type":"ContainerDied","Data":"205259eaeb81467f1579bfaca53fc93c660c4444f400e9fb9fe045ab24e36cba"} Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.808052 5047 scope.go:117] "RemoveContainer" containerID="992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.808446 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.809256 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="probe" containerID="cri-o://bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d" gracePeriod=30 Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.815254 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5ff9c9c5c6-pkfqh" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.855877 5047 scope.go:117] "RemoveContainer" containerID="c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.882770 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-swift-storage-0\") pod \"bdaf4744-839c-4cad-aae8-c0b7396d9915\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.882995 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-nb\") pod \"bdaf4744-839c-4cad-aae8-c0b7396d9915\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.883049 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-svc\") pod \"bdaf4744-839c-4cad-aae8-c0b7396d9915\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.883123 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n66rp\" 
(UniqueName: \"kubernetes.io/projected/bdaf4744-839c-4cad-aae8-c0b7396d9915-kube-api-access-n66rp\") pod \"bdaf4744-839c-4cad-aae8-c0b7396d9915\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.883181 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-sb\") pod \"bdaf4744-839c-4cad-aae8-c0b7396d9915\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.883281 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-config\") pod \"bdaf4744-839c-4cad-aae8-c0b7396d9915\" (UID: \"bdaf4744-839c-4cad-aae8-c0b7396d9915\") " Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.899491 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdaf4744-839c-4cad-aae8-c0b7396d9915-kube-api-access-n66rp" (OuterVolumeSpecName: "kube-api-access-n66rp") pod "bdaf4744-839c-4cad-aae8-c0b7396d9915" (UID: "bdaf4744-839c-4cad-aae8-c0b7396d9915"). InnerVolumeSpecName "kube-api-access-n66rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.960850 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bdaf4744-839c-4cad-aae8-c0b7396d9915" (UID: "bdaf4744-839c-4cad-aae8-c0b7396d9915"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.966304 5047 scope.go:117] "RemoveContainer" containerID="992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.966777 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bdaf4744-839c-4cad-aae8-c0b7396d9915" (UID: "bdaf4744-839c-4cad-aae8-c0b7396d9915"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:42 crc kubenswrapper[5047]: E0223 07:07:42.967363 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c\": container with ID starting with 992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c not found: ID does not exist" containerID="992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.967402 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c"} err="failed to get container status \"992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c\": rpc error: code = NotFound desc = could not find container \"992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c\": container with ID starting with 992e78ea8204955943ad3ed2818ae4a36e25c710cae1df252541c2cf6ffc793c not found: ID does not exist" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.967436 5047 scope.go:117] "RemoveContainer" containerID="c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57" Feb 23 07:07:42 crc kubenswrapper[5047]: E0223 07:07:42.967922 5047 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57\": container with ID starting with c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57 not found: ID does not exist" containerID="c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.967949 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57"} err="failed to get container status \"c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57\": rpc error: code = NotFound desc = could not find container \"c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57\": container with ID starting with c11f2ddda63038dac539e4536e912b8a4c9b69c019ac7074f03f3519b4cd2b57 not found: ID does not exist" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.991248 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.991293 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n66rp\" (UniqueName: \"kubernetes.io/projected/bdaf4744-839c-4cad-aae8-c0b7396d9915-kube-api-access-n66rp\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:42 crc kubenswrapper[5047]: I0223 07:07:42.991306 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.015465 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bdaf4744-839c-4cad-aae8-c0b7396d9915" (UID: "bdaf4744-839c-4cad-aae8-c0b7396d9915"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.039187 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bdaf4744-839c-4cad-aae8-c0b7396d9915" (UID: "bdaf4744-839c-4cad-aae8-c0b7396d9915"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.078709 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-config" (OuterVolumeSpecName: "config") pod "bdaf4744-839c-4cad-aae8-c0b7396d9915" (UID: "bdaf4744-839c-4cad-aae8-c0b7396d9915"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.093952 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.093988 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.094005 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaf4744-839c-4cad-aae8-c0b7396d9915-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.185331 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-b2csx"] Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.193837 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-b2csx"] Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.202836 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54d54c8555-hb2hm"] Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.203264 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54d54c8555-hb2hm" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-api" containerID="cri-o://e0295183d4622f15cec9404b2b73a238a6d5a5e45dd0fa339ba26d01acc57cfd" gracePeriod=30 Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.203616 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54d54c8555-hb2hm" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-httpd" 
containerID="cri-o://2249ebf762a07cddeea531d1e4a71499a988887c4ca6bcba65fac88ec4cbd24f" gracePeriod=30 Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.218094 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-785b8b86cf-6rvf8"] Feb 23 07:07:43 crc kubenswrapper[5047]: E0223 07:07:43.218793 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerName="init" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.218816 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerName="init" Feb 23 07:07:43 crc kubenswrapper[5047]: E0223 07:07:43.218855 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerName="dnsmasq-dns" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.218863 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerName="dnsmasq-dns" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.223268 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerName="dnsmasq-dns" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.224549 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.226428 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-54d54c8555-hb2hm" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": read tcp 10.217.0.2:39398->10.217.0.152:9696: read: connection reset by peer" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.245524 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-785b8b86cf-6rvf8"] Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.282838 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.298298 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-config\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.298356 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-internal-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.298431 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-public-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc 
kubenswrapper[5047]: I0223 07:07:43.298655 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-ovndb-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.298690 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-combined-ca-bundle\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.298720 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-httpd-config\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.298752 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnvxl\" (UniqueName: \"kubernetes.io/projected/8b703a8a-7e8f-4565-abc7-86f93a83e742-kube-api-access-hnvxl\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.400869 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-httpd-config\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc 
kubenswrapper[5047]: I0223 07:07:43.400989 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnvxl\" (UniqueName: \"kubernetes.io/projected/8b703a8a-7e8f-4565-abc7-86f93a83e742-kube-api-access-hnvxl\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.401477 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-config\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.402026 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-internal-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.402139 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-public-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.402205 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-ovndb-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.402240 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-combined-ca-bundle\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.406936 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-config\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.407614 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-combined-ca-bundle\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.408402 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-httpd-config\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.409164 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-public-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.410206 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-internal-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.410936 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-ovndb-tls-certs\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.422725 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnvxl\" (UniqueName: \"kubernetes.io/projected/8b703a8a-7e8f-4565-abc7-86f93a83e742-kube-api-access-hnvxl\") pod \"neutron-785b8b86cf-6rvf8\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.465257 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.590706 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.819030 5047 generic.go:334] "Generic (PLEG): container finished" podID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerID="2249ebf762a07cddeea531d1e4a71499a988887c4ca6bcba65fac88ec4cbd24f" exitCode=0 Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.819470 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54d54c8555-hb2hm" event={"ID":"89c85b56-2c82-442b-af26-77a1f0d294f4","Type":"ContainerDied","Data":"2249ebf762a07cddeea531d1e4a71499a988887c4ca6bcba65fac88ec4cbd24f"} Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.823119 5047 generic.go:334] "Generic (PLEG): container finished" podID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerID="bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d" exitCode=0 Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.823172 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3495c7bc-8bda-4e3e-bb7e-b147b38426cc","Type":"ContainerDied","Data":"bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d"} Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.828002 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerStarted","Data":"6cb9f954c7db0d6e83e722d0a6b5a9e9b5954f9e8e9da60cba495b6d5d534e8d"} Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.828137 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:07:43 crc kubenswrapper[5047]: I0223 07:07:43.854253 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.902301847 podStartE2EDuration="9.854233535s" podCreationTimestamp="2026-02-23 07:07:34 +0000 UTC" firstStartedPulling="2026-02-23 07:07:35.85788034 +0000 UTC 
m=+1378.109207474" lastFinishedPulling="2026-02-23 07:07:42.809812028 +0000 UTC m=+1385.061139162" observedRunningTime="2026-02-23 07:07:43.852157503 +0000 UTC m=+1386.103484637" watchObservedRunningTime="2026-02-23 07:07:43.854233535 +0000 UTC m=+1386.105560669" Feb 23 07:07:44 crc kubenswrapper[5047]: I0223 07:07:44.216310 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-785b8b86cf-6rvf8"] Feb 23 07:07:44 crc kubenswrapper[5047]: W0223 07:07:44.222947 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b703a8a_7e8f_4565_abc7_86f93a83e742.slice/crio-f20d701da5a4e0807519f207c49a42c6639b83dfbf4cfe7619425012b38e9f00 WatchSource:0}: Error finding container f20d701da5a4e0807519f207c49a42c6639b83dfbf4cfe7619425012b38e9f00: Status 404 returned error can't find the container with id f20d701da5a4e0807519f207c49a42c6639b83dfbf4cfe7619425012b38e9f00 Feb 23 07:07:44 crc kubenswrapper[5047]: I0223 07:07:44.355150 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" path="/var/lib/kubelet/pods/bdaf4744-839c-4cad-aae8-c0b7396d9915/volumes" Feb 23 07:07:44 crc kubenswrapper[5047]: I0223 07:07:44.856564 5047 generic.go:334] "Generic (PLEG): container finished" podID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerID="e0295183d4622f15cec9404b2b73a238a6d5a5e45dd0fa339ba26d01acc57cfd" exitCode=0 Feb 23 07:07:44 crc kubenswrapper[5047]: I0223 07:07:44.856629 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54d54c8555-hb2hm" event={"ID":"89c85b56-2c82-442b-af26-77a1f0d294f4","Type":"ContainerDied","Data":"e0295183d4622f15cec9404b2b73a238a6d5a5e45dd0fa339ba26d01acc57cfd"} Feb 23 07:07:44 crc kubenswrapper[5047]: I0223 07:07:44.868344 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785b8b86cf-6rvf8" 
event={"ID":"8b703a8a-7e8f-4565-abc7-86f93a83e742","Type":"ContainerStarted","Data":"563e1e2512e494abff485a89912e960e6a601055dece369633c27c745f0ea956"} Feb 23 07:07:44 crc kubenswrapper[5047]: I0223 07:07:44.868570 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785b8b86cf-6rvf8" event={"ID":"8b703a8a-7e8f-4565-abc7-86f93a83e742","Type":"ContainerStarted","Data":"09006301b0f6020685672d00703b449ef9d808aef32ba06e059c9e2356603a84"} Feb 23 07:07:44 crc kubenswrapper[5047]: I0223 07:07:44.868595 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785b8b86cf-6rvf8" event={"ID":"8b703a8a-7e8f-4565-abc7-86f93a83e742","Type":"ContainerStarted","Data":"f20d701da5a4e0807519f207c49a42c6639b83dfbf4cfe7619425012b38e9f00"} Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.415514 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.442163 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wd9m\" (UniqueName: \"kubernetes.io/projected/89c85b56-2c82-442b-af26-77a1f0d294f4-kube-api-access-8wd9m\") pod \"89c85b56-2c82-442b-af26-77a1f0d294f4\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.442271 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-ovndb-tls-certs\") pod \"89c85b56-2c82-442b-af26-77a1f0d294f4\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.442300 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-httpd-config\") pod \"89c85b56-2c82-442b-af26-77a1f0d294f4\" (UID: 
\"89c85b56-2c82-442b-af26-77a1f0d294f4\") " Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.442433 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-public-tls-certs\") pod \"89c85b56-2c82-442b-af26-77a1f0d294f4\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.442455 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-config\") pod \"89c85b56-2c82-442b-af26-77a1f0d294f4\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.444594 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-internal-tls-certs\") pod \"89c85b56-2c82-442b-af26-77a1f0d294f4\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.444709 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-combined-ca-bundle\") pod \"89c85b56-2c82-442b-af26-77a1f0d294f4\" (UID: \"89c85b56-2c82-442b-af26-77a1f0d294f4\") " Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.450139 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-785b8b86cf-6rvf8" podStartSLOduration=2.450106441 podStartE2EDuration="2.450106441s" podCreationTimestamp="2026-02-23 07:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:44.908988834 +0000 UTC m=+1387.160315968" watchObservedRunningTime="2026-02-23 07:07:45.450106441 +0000 UTC 
m=+1387.701433585" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.453243 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "89c85b56-2c82-442b-af26-77a1f0d294f4" (UID: "89c85b56-2c82-442b-af26-77a1f0d294f4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.471104 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c85b56-2c82-442b-af26-77a1f0d294f4-kube-api-access-8wd9m" (OuterVolumeSpecName: "kube-api-access-8wd9m") pod "89c85b56-2c82-442b-af26-77a1f0d294f4" (UID: "89c85b56-2c82-442b-af26-77a1f0d294f4"). InnerVolumeSpecName "kube-api-access-8wd9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.518236 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "89c85b56-2c82-442b-af26-77a1f0d294f4" (UID: "89c85b56-2c82-442b-af26-77a1f0d294f4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.519460 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "89c85b56-2c82-442b-af26-77a1f0d294f4" (UID: "89c85b56-2c82-442b-af26-77a1f0d294f4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.522098 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-config" (OuterVolumeSpecName: "config") pod "89c85b56-2c82-442b-af26-77a1f0d294f4" (UID: "89c85b56-2c82-442b-af26-77a1f0d294f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.536761 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89c85b56-2c82-442b-af26-77a1f0d294f4" (UID: "89c85b56-2c82-442b-af26-77a1f0d294f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.548809 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wd9m\" (UniqueName: \"kubernetes.io/projected/89c85b56-2c82-442b-af26-77a1f0d294f4-kube-api-access-8wd9m\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.549007 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.549093 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.549168 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc 
kubenswrapper[5047]: I0223 07:07:45.549247 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.549328 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.557733 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "89c85b56-2c82-442b-af26-77a1f0d294f4" (UID: "89c85b56-2c82-442b-af26-77a1f0d294f4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.651445 5047 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c85b56-2c82-442b-af26-77a1f0d294f4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.882009 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54d54c8555-hb2hm" event={"ID":"89c85b56-2c82-442b-af26-77a1f0d294f4","Type":"ContainerDied","Data":"ee45005f916de2f639683a0185ceb146e57fd7e1d5283e699364f1fa53f26153"} Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.882136 5047 scope.go:117] "RemoveContainer" containerID="2249ebf762a07cddeea531d1e4a71499a988887c4ca6bcba65fac88ec4cbd24f" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.882071 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54d54c8555-hb2hm" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.882405 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.919704 5047 scope.go:117] "RemoveContainer" containerID="e0295183d4622f15cec9404b2b73a238a6d5a5e45dd0fa339ba26d01acc57cfd" Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.935872 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54d54c8555-hb2hm"] Feb 23 07:07:45 crc kubenswrapper[5047]: I0223 07:07:45.950703 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-54d54c8555-hb2hm"] Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.357316 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" path="/var/lib/kubelet/pods/89c85b56-2c82-442b-af26-77a1f0d294f4/volumes" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.604483 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.684155 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data\") pod \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.684597 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-combined-ca-bundle\") pod \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.684743 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-etc-machine-id\") pod \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.684926 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-scripts\") pod \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.684897 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3495c7bc-8bda-4e3e-bb7e-b147b38426cc" (UID: "3495c7bc-8bda-4e3e-bb7e-b147b38426cc"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.685094 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg8r2\" (UniqueName: \"kubernetes.io/projected/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-kube-api-access-bg8r2\") pod \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.685301 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data-custom\") pod \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\" (UID: \"3495c7bc-8bda-4e3e-bb7e-b147b38426cc\") " Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.686269 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.692081 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3495c7bc-8bda-4e3e-bb7e-b147b38426cc" (UID: "3495c7bc-8bda-4e3e-bb7e-b147b38426cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.693919 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-scripts" (OuterVolumeSpecName: "scripts") pod "3495c7bc-8bda-4e3e-bb7e-b147b38426cc" (UID: "3495c7bc-8bda-4e3e-bb7e-b147b38426cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.694104 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-kube-api-access-bg8r2" (OuterVolumeSpecName: "kube-api-access-bg8r2") pod "3495c7bc-8bda-4e3e-bb7e-b147b38426cc" (UID: "3495c7bc-8bda-4e3e-bb7e-b147b38426cc"). InnerVolumeSpecName "kube-api-access-bg8r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.742085 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3495c7bc-8bda-4e3e-bb7e-b147b38426cc" (UID: "3495c7bc-8bda-4e3e-bb7e-b147b38426cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.786249 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data" (OuterVolumeSpecName: "config-data") pod "3495c7bc-8bda-4e3e-bb7e-b147b38426cc" (UID: "3495c7bc-8bda-4e3e-bb7e-b147b38426cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.789233 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.789266 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg8r2\" (UniqueName: \"kubernetes.io/projected/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-kube-api-access-bg8r2\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.789280 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.789293 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.789307 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3495c7bc-8bda-4e3e-bb7e-b147b38426cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.898035 5047 generic.go:334] "Generic (PLEG): container finished" podID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerID="f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee" exitCode=0 Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.898166 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3495c7bc-8bda-4e3e-bb7e-b147b38426cc","Type":"ContainerDied","Data":"f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee"} Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 
07:07:46.898238 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3495c7bc-8bda-4e3e-bb7e-b147b38426cc","Type":"ContainerDied","Data":"7e24c487dac0f31ebfa1a01de0e94c781fa3bc8fa4506c0265942195d999c60a"} Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.898260 5047 scope.go:117] "RemoveContainer" containerID="bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.898330 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.937880 5047 scope.go:117] "RemoveContainer" containerID="f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.940182 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.948078 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.980103 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:46 crc kubenswrapper[5047]: E0223 07:07:46.980832 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-api" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.980852 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-api" Feb 23 07:07:46 crc kubenswrapper[5047]: E0223 07:07:46.980872 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-httpd" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.980880 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-httpd" Feb 23 07:07:46 crc kubenswrapper[5047]: E0223 07:07:46.980903 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="cinder-scheduler" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.980925 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="cinder-scheduler" Feb 23 07:07:46 crc kubenswrapper[5047]: E0223 07:07:46.980956 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="probe" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.980962 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="probe" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.981149 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="probe" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.981165 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-api" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.981185 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" containerName="cinder-scheduler" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.981203 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c85b56-2c82-442b-af26-77a1f0d294f4" containerName="neutron-httpd" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.982570 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.984226 5047 scope.go:117] "RemoveContainer" containerID="bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d" Feb 23 07:07:46 crc kubenswrapper[5047]: E0223 07:07:46.984753 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d\": container with ID starting with bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d not found: ID does not exist" containerID="bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.984807 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d"} err="failed to get container status \"bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d\": rpc error: code = NotFound desc = could not find container \"bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d\": container with ID starting with bed7ad6764de94b649d939d5d9538f190d0a3d063c09a05000c745b08428162d not found: ID does not exist" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.984844 5047 scope.go:117] "RemoveContainer" containerID="f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee" Feb 23 07:07:46 crc kubenswrapper[5047]: E0223 07:07:46.985192 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee\": container with ID starting with f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee not found: ID does not exist" containerID="f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 
07:07:46.985227 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee"} err="failed to get container status \"f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee\": rpc error: code = NotFound desc = could not find container \"f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee\": container with ID starting with f10597bbd52866628f9f6ccf870c2f025b4b71e4c880e5bde19c27453acb0dee not found: ID does not exist" Feb 23 07:07:46 crc kubenswrapper[5047]: I0223 07:07:46.990564 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.002019 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.095492 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.095579 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.095627 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-scripts\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " 
pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.095696 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-478lp\" (UniqueName: \"kubernetes.io/projected/4352c518-ada9-4e5e-9327-5bd3c34a2796-kube-api-access-478lp\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.096073 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.096502 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4352c518-ada9-4e5e-9327-5bd3c34a2796-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.198898 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.199453 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 
07:07:47.199487 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-scripts\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.199535 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-478lp\" (UniqueName: \"kubernetes.io/projected/4352c518-ada9-4e5e-9327-5bd3c34a2796-kube-api-access-478lp\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.199596 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.199643 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4352c518-ada9-4e5e-9327-5bd3c34a2796-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.199781 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4352c518-ada9-4e5e-9327-5bd3c34a2796-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.206055 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.207791 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-scripts\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.207817 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.208021 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.226785 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-478lp\" (UniqueName: \"kubernetes.io/projected/4352c518-ada9-4e5e-9327-5bd3c34a2796-kube-api-access-478lp\") pod \"cinder-scheduler-0\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.314022 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.563530 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-db5c97f8f-b2csx" podUID="bdaf4744-839c-4cad-aae8-c0b7396d9915" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: i/o timeout" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.638016 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.638445 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c6b9994d4-nt845" Feb 23 07:07:47 crc kubenswrapper[5047]: W0223 07:07:47.802294 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4352c518_ada9_4e5e_9327_5bd3c34a2796.slice/crio-34bba24d9435900c4d3d508105adeca7395482f32084702d9e129a01e170eb4d WatchSource:0}: Error finding container 34bba24d9435900c4d3d508105adeca7395482f32084702d9e129a01e170eb4d: Status 404 returned error can't find the container with id 34bba24d9435900c4d3d508105adeca7395482f32084702d9e129a01e170eb4d Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.803946 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.925048 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4352c518-ada9-4e5e-9327-5bd3c34a2796","Type":"ContainerStarted","Data":"34bba24d9435900c4d3d508105adeca7395482f32084702d9e129a01e170eb4d"} Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.932149 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f4db5cb66-tjpmr"] Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.934438 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:47 crc kubenswrapper[5047]: I0223 07:07:47.953172 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f4db5cb66-tjpmr"] Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.017619 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-internal-tls-certs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.017787 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.018191 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-public-tls-certs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.018478 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.018542 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-combined-ca-bundle\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.019251 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a452777f-60a7-4cfc-9e9b-b262be0a27cf-logs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.019370 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k82bh\" (UniqueName: \"kubernetes.io/projected/a452777f-60a7-4cfc-9e9b-b262be0a27cf-kube-api-access-k82bh\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.121602 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k82bh\" (UniqueName: \"kubernetes.io/projected/a452777f-60a7-4cfc-9e9b-b262be0a27cf-kube-api-access-k82bh\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.121737 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-internal-tls-certs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.121797 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.121832 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-public-tls-certs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.121958 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.121989 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-combined-ca-bundle\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.122099 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a452777f-60a7-4cfc-9e9b-b262be0a27cf-logs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.122893 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a452777f-60a7-4cfc-9e9b-b262be0a27cf-logs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: 
\"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.126275 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.126580 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.126625 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-public-tls-certs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.126888 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-internal-tls-certs\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.137107 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-combined-ca-bundle\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: 
I0223 07:07:48.138519 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k82bh\" (UniqueName: \"kubernetes.io/projected/a452777f-60a7-4cfc-9e9b-b262be0a27cf-kube-api-access-k82bh\") pod \"placement-5f4db5cb66-tjpmr\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.266484 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.415600 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3495c7bc-8bda-4e3e-bb7e-b147b38426cc" path="/var/lib/kubelet/pods/3495c7bc-8bda-4e3e-bb7e-b147b38426cc/volumes" Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.786322 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f4db5cb66-tjpmr"] Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.970127 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4352c518-ada9-4e5e-9327-5bd3c34a2796","Type":"ContainerStarted","Data":"b1156e1b07cb20a8e9857fce83fae53cb9cfb48c367023bcb30470fba1c4f122"} Feb 23 07:07:48 crc kubenswrapper[5047]: I0223 07:07:48.972040 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f4db5cb66-tjpmr" event={"ID":"a452777f-60a7-4cfc-9e9b-b262be0a27cf","Type":"ContainerStarted","Data":"a48ca9edf9fb0ab543fac6566b706c45ff5504533f7079893ebae06fdb015716"} Feb 23 07:07:50 crc kubenswrapper[5047]: I0223 07:07:50.038671 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4352c518-ada9-4e5e-9327-5bd3c34a2796","Type":"ContainerStarted","Data":"999e38cfe887aedaec12de40ca31ea006bf118394964b12586bbd7546473dc73"} Feb 23 07:07:50 crc kubenswrapper[5047]: I0223 07:07:50.042736 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-5f4db5cb66-tjpmr" event={"ID":"a452777f-60a7-4cfc-9e9b-b262be0a27cf","Type":"ContainerStarted","Data":"1bff42b06a8741633963197012e9bcd6401b4df5472444c488f4134bb829bd6d"} Feb 23 07:07:50 crc kubenswrapper[5047]: I0223 07:07:50.042926 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f4db5cb66-tjpmr" event={"ID":"a452777f-60a7-4cfc-9e9b-b262be0a27cf","Type":"ContainerStarted","Data":"a0e78cccdc69fcb4e0931f1814bebbb8de46564c39b911b4f4864d69cd56c138"} Feb 23 07:07:50 crc kubenswrapper[5047]: I0223 07:07:50.043046 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:50 crc kubenswrapper[5047]: I0223 07:07:50.068236 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.068215472 podStartE2EDuration="4.068215472s" podCreationTimestamp="2026-02-23 07:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:50.066183581 +0000 UTC m=+1392.317510715" watchObservedRunningTime="2026-02-23 07:07:50.068215472 +0000 UTC m=+1392.319542606" Feb 23 07:07:50 crc kubenswrapper[5047]: I0223 07:07:50.093444 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f4db5cb66-tjpmr" podStartSLOduration=3.09342335 podStartE2EDuration="3.09342335s" podCreationTimestamp="2026-02-23 07:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:50.089434438 +0000 UTC m=+1392.340761572" watchObservedRunningTime="2026-02-23 07:07:50.09342335 +0000 UTC m=+1392.344750484" Feb 23 07:07:50 crc kubenswrapper[5047]: I0223 07:07:50.813749 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 23 07:07:50 crc 
kubenswrapper[5047]: I0223 07:07:50.924370 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:51 crc kubenswrapper[5047]: I0223 07:07:51.051601 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:07:51 crc kubenswrapper[5047]: I0223 07:07:51.094813 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:07:51 crc kubenswrapper[5047]: I0223 07:07:51.231475 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5856459f5b-mnzgp"] Feb 23 07:07:51 crc kubenswrapper[5047]: I0223 07:07:51.231779 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5856459f5b-mnzgp" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api-log" containerID="cri-o://ac57ae972b740c745ab6d81111292382cca65cd485b190d599b552aa597e419a" gracePeriod=30 Feb 23 07:07:51 crc kubenswrapper[5047]: I0223 07:07:51.232339 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5856459f5b-mnzgp" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api" containerID="cri-o://8dd78583fe6bc8b5e349ae87651f76072d80a628b66dd462ab8922f0b45f99fc" gracePeriod=30 Feb 23 07:07:52 crc kubenswrapper[5047]: I0223 07:07:52.065002 5047 generic.go:334] "Generic (PLEG): container finished" podID="b602e57c-15c3-42cc-b37a-740047deb948" containerID="ac57ae972b740c745ab6d81111292382cca65cd485b190d599b552aa597e419a" exitCode=143 Feb 23 07:07:52 crc kubenswrapper[5047]: I0223 07:07:52.066646 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856459f5b-mnzgp" event={"ID":"b602e57c-15c3-42cc-b37a-740047deb948","Type":"ContainerDied","Data":"ac57ae972b740c745ab6d81111292382cca65cd485b190d599b552aa597e419a"} Feb 23 07:07:52 
crc kubenswrapper[5047]: I0223 07:07:52.314592 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 07:07:53 crc kubenswrapper[5047]: I0223 07:07:53.672465 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:07:54 crc kubenswrapper[5047]: I0223 07:07:54.683554 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5856459f5b-mnzgp" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:55760->10.217.0.158:9311: read: connection reset by peer" Feb 23 07:07:54 crc kubenswrapper[5047]: I0223 07:07:54.683690 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5856459f5b-mnzgp" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:55776->10.217.0.158:9311: read: connection reset by peer" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.094312 5047 generic.go:334] "Generic (PLEG): container finished" podID="b602e57c-15c3-42cc-b37a-740047deb948" containerID="8dd78583fe6bc8b5e349ae87651f76072d80a628b66dd462ab8922f0b45f99fc" exitCode=0 Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.094402 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856459f5b-mnzgp" event={"ID":"b602e57c-15c3-42cc-b37a-740047deb948","Type":"ContainerDied","Data":"8dd78583fe6bc8b5e349ae87651f76072d80a628b66dd462ab8922f0b45f99fc"} Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.204336 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.403466 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data-custom\") pod \"b602e57c-15c3-42cc-b37a-740047deb948\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.405036 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-combined-ca-bundle\") pod \"b602e57c-15c3-42cc-b37a-740047deb948\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.405286 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b602e57c-15c3-42cc-b37a-740047deb948-logs\") pod \"b602e57c-15c3-42cc-b37a-740047deb948\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.405444 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg5lz\" (UniqueName: \"kubernetes.io/projected/b602e57c-15c3-42cc-b37a-740047deb948-kube-api-access-cg5lz\") pod \"b602e57c-15c3-42cc-b37a-740047deb948\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.405616 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data\") pod \"b602e57c-15c3-42cc-b37a-740047deb948\" (UID: \"b602e57c-15c3-42cc-b37a-740047deb948\") " Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.407414 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b602e57c-15c3-42cc-b37a-740047deb948-logs" (OuterVolumeSpecName: "logs") pod "b602e57c-15c3-42cc-b37a-740047deb948" (UID: "b602e57c-15c3-42cc-b37a-740047deb948"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.415105 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b602e57c-15c3-42cc-b37a-740047deb948" (UID: "b602e57c-15c3-42cc-b37a-740047deb948"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.432153 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b602e57c-15c3-42cc-b37a-740047deb948-kube-api-access-cg5lz" (OuterVolumeSpecName: "kube-api-access-cg5lz") pod "b602e57c-15c3-42cc-b37a-740047deb948" (UID: "b602e57c-15c3-42cc-b37a-740047deb948"). InnerVolumeSpecName "kube-api-access-cg5lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.482772 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b602e57c-15c3-42cc-b37a-740047deb948" (UID: "b602e57c-15c3-42cc-b37a-740047deb948"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.503492 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data" (OuterVolumeSpecName: "config-data") pod "b602e57c-15c3-42cc-b37a-740047deb948" (UID: "b602e57c-15c3-42cc-b37a-740047deb948"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.508174 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.508275 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.508289 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b602e57c-15c3-42cc-b37a-740047deb948-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.508299 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg5lz\" (UniqueName: \"kubernetes.io/projected/b602e57c-15c3-42cc-b37a-740047deb948-kube-api-access-cg5lz\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.508311 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b602e57c-15c3-42cc-b37a-740047deb948-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.632466 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 23 07:07:55 crc kubenswrapper[5047]: E0223 07:07:55.633148 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api-log" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.633188 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api-log" Feb 23 07:07:55 crc 
kubenswrapper[5047]: E0223 07:07:55.633279 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.633301 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.633716 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.633769 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b602e57c-15c3-42cc-b37a-740047deb948" containerName="barbican-api-log" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.635362 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.638079 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.638855 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-m45fr" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.638993 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.645774 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.713045 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " 
pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.713396 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.713425 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.713486 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxnk7\" (UniqueName: \"kubernetes.io/projected/ae539167-f011-4f50-8ce6-df90580fa157-kube-api-access-cxnk7\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.815822 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.815921 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 
07:07:55.815955 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.816052 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxnk7\" (UniqueName: \"kubernetes.io/projected/ae539167-f011-4f50-8ce6-df90580fa157-kube-api-access-cxnk7\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.817404 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.822381 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.822548 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.834723 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxnk7\" (UniqueName: 
\"kubernetes.io/projected/ae539167-f011-4f50-8ce6-df90580fa157-kube-api-access-cxnk7\") pod \"openstackclient\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " pod="openstack/openstackclient" Feb 23 07:07:55 crc kubenswrapper[5047]: I0223 07:07:55.956658 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.125876 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856459f5b-mnzgp" event={"ID":"b602e57c-15c3-42cc-b37a-740047deb948","Type":"ContainerDied","Data":"b9ff29c385a088cd67807ff7cf9447cc7236b3ff0917f6b7b84ff973042bfa02"} Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.125992 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5856459f5b-mnzgp" Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.126010 5047 scope.go:117] "RemoveContainer" containerID="8dd78583fe6bc8b5e349ae87651f76072d80a628b66dd462ab8922f0b45f99fc" Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.162886 5047 scope.go:117] "RemoveContainer" containerID="ac57ae972b740c745ab6d81111292382cca65cd485b190d599b552aa597e419a" Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.183779 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5856459f5b-mnzgp"] Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.199999 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5856459f5b-mnzgp"] Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.352388 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b602e57c-15c3-42cc-b37a-740047deb948" path="/var/lib/kubelet/pods/b602e57c-15c3-42cc-b37a-740047deb948/volumes" Feb 23 07:07:56 crc kubenswrapper[5047]: I0223 07:07:56.493149 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 07:07:56 crc kubenswrapper[5047]: 
W0223 07:07:56.494569 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae539167_f011_4f50_8ce6_df90580fa157.slice/crio-856eae32c890684ab1f8c68c9d296c8416d340c40308a863b2241e5eaf7aa35b WatchSource:0}: Error finding container 856eae32c890684ab1f8c68c9d296c8416d340c40308a863b2241e5eaf7aa35b: Status 404 returned error can't find the container with id 856eae32c890684ab1f8c68c9d296c8416d340c40308a863b2241e5eaf7aa35b Feb 23 07:07:57 crc kubenswrapper[5047]: I0223 07:07:57.137563 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ae539167-f011-4f50-8ce6-df90580fa157","Type":"ContainerStarted","Data":"856eae32c890684ab1f8c68c9d296c8416d340c40308a863b2241e5eaf7aa35b"} Feb 23 07:07:57 crc kubenswrapper[5047]: I0223 07:07:57.612663 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.460467 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d867c5cb7-qbvv7"] Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.465342 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.474767 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.475384 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.475494 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.504377 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d867c5cb7-qbvv7"] Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.549702 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.549754 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-internal-tls-certs\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.549823 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4v2\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc 
kubenswrapper[5047]: I0223 07:08:01.549857 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-run-httpd\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.549873 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-combined-ca-bundle\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.549894 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-log-httpd\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.549941 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-public-tls-certs\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.549961 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-config-data\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 
07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.653189 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4v2\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.653286 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-combined-ca-bundle\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.656093 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-run-httpd\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.656123 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-log-httpd\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.656170 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-public-tls-certs\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.656194 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-config-data\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.656352 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.656379 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-internal-tls-certs\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.659520 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-run-httpd\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.660833 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-log-httpd\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.666994 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-config-data\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.668604 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-public-tls-certs\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.671836 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-internal-tls-certs\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.673830 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.681654 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4v2\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2\") pod \"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.689154 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-combined-ca-bundle\") pod 
\"swift-proxy-d867c5cb7-qbvv7\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:01 crc kubenswrapper[5047]: I0223 07:08:01.792501 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:02 crc kubenswrapper[5047]: I0223 07:08:02.077740 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:02 crc kubenswrapper[5047]: I0223 07:08:02.078607 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-central-agent" containerID="cri-o://8d83ede9036b9fd8c2e334d730c2786748c4c52e7f0242494531c434bc3ec5be" gracePeriod=30 Feb 23 07:08:02 crc kubenswrapper[5047]: I0223 07:08:02.078710 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="proxy-httpd" containerID="cri-o://6cb9f954c7db0d6e83e722d0a6b5a9e9b5954f9e8e9da60cba495b6d5d534e8d" gracePeriod=30 Feb 23 07:08:02 crc kubenswrapper[5047]: I0223 07:08:02.078790 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="sg-core" containerID="cri-o://1710a1afaf8b8d1863ad4a91900f4520a33242333b0ea1818cb9d87e03a7c7e5" gracePeriod=30 Feb 23 07:08:02 crc kubenswrapper[5047]: I0223 07:08:02.078832 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-notification-agent" containerID="cri-o://b16634f8ca7b2464266a2169de1e2b10c81111ece98874a629f7e7c4dbd657c8" gracePeriod=30 Feb 23 07:08:02 crc kubenswrapper[5047]: I0223 07:08:02.084993 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.206625 5047 generic.go:334] "Generic (PLEG): container finished" podID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerID="6cb9f954c7db0d6e83e722d0a6b5a9e9b5954f9e8e9da60cba495b6d5d534e8d" exitCode=0 Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.206668 5047 generic.go:334] "Generic (PLEG): container finished" podID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerID="1710a1afaf8b8d1863ad4a91900f4520a33242333b0ea1818cb9d87e03a7c7e5" exitCode=2 Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.206677 5047 generic.go:334] "Generic (PLEG): container finished" podID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerID="8d83ede9036b9fd8c2e334d730c2786748c4c52e7f0242494531c434bc3ec5be" exitCode=0 Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.206700 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerDied","Data":"6cb9f954c7db0d6e83e722d0a6b5a9e9b5954f9e8e9da60cba495b6d5d534e8d"} Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.206737 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerDied","Data":"1710a1afaf8b8d1863ad4a91900f4520a33242333b0ea1818cb9d87e03a7c7e5"} Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.206748 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerDied","Data":"8d83ede9036b9fd8c2e334d730c2786748c4c52e7f0242494531c434bc3ec5be"} Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.367437 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.367749 5047 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-log" containerID="cri-o://32f4b174839bd5ba33fa916bb23331139b624944cb0f61b780cd7e606f20324a" gracePeriod=30 Feb 23 07:08:03 crc kubenswrapper[5047]: I0223 07:08:03.368272 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-httpd" containerID="cri-o://ace6ecfe551c44e9efb31d1ea90d89e9ff52a5828d4bf6e2593aac5d72081b6e" gracePeriod=30 Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.190858 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-w65gd"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.193090 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.227033 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-w65gd"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.245242 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx7c6\" (UniqueName: \"kubernetes.io/projected/49095dcf-0259-4bde-a0c1-695841ebd224-kube-api-access-mx7c6\") pod \"nova-api-db-create-w65gd\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.245441 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49095dcf-0259-4bde-a0c1-695841ebd224-operator-scripts\") pod \"nova-api-db-create-w65gd\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.252849 5047 generic.go:334] "Generic (PLEG): 
container finished" podID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerID="32f4b174839bd5ba33fa916bb23331139b624944cb0f61b780cd7e606f20324a" exitCode=143 Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.252951 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"391de77f-1a20-4fdf-90fb-413ab5e7cc3f","Type":"ContainerDied","Data":"32f4b174839bd5ba33fa916bb23331139b624944cb0f61b780cd7e606f20324a"} Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.292723 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-r5j6n"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.294715 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.304223 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r5j6n"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.347236 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49095dcf-0259-4bde-a0c1-695841ebd224-operator-scripts\") pod \"nova-api-db-create-w65gd\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.347298 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4675c789-acd8-480a-a09f-52ab9a827bb6-operator-scripts\") pod \"nova-cell0-db-create-r5j6n\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.347411 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx7c6\" (UniqueName: 
\"kubernetes.io/projected/49095dcf-0259-4bde-a0c1-695841ebd224-kube-api-access-mx7c6\") pod \"nova-api-db-create-w65gd\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.347558 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsj9x\" (UniqueName: \"kubernetes.io/projected/4675c789-acd8-480a-a09f-52ab9a827bb6-kube-api-access-jsj9x\") pod \"nova-cell0-db-create-r5j6n\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.348940 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49095dcf-0259-4bde-a0c1-695841ebd224-operator-scripts\") pod \"nova-api-db-create-w65gd\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.384998 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx7c6\" (UniqueName: \"kubernetes.io/projected/49095dcf-0259-4bde-a0c1-695841ebd224-kube-api-access-mx7c6\") pod \"nova-api-db-create-w65gd\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.405617 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ac64-account-create-update-thwgw"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.407227 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.433470 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.461560 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4675c789-acd8-480a-a09f-52ab9a827bb6-operator-scripts\") pod \"nova-cell0-db-create-r5j6n\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.461664 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67q8t\" (UniqueName: \"kubernetes.io/projected/15d0a292-4600-45ed-a947-37e93bccaea8-kube-api-access-67q8t\") pod \"nova-api-ac64-account-create-update-thwgw\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.461712 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d0a292-4600-45ed-a947-37e93bccaea8-operator-scripts\") pod \"nova-api-ac64-account-create-update-thwgw\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.461766 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsj9x\" (UniqueName: \"kubernetes.io/projected/4675c789-acd8-480a-a09f-52ab9a827bb6-kube-api-access-jsj9x\") pod \"nova-cell0-db-create-r5j6n\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.462693 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4675c789-acd8-480a-a09f-52ab9a827bb6-operator-scripts\") pod \"nova-cell0-db-create-r5j6n\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.492755 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac64-account-create-update-thwgw"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.512507 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.519106 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-z9zcf"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.521381 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.534535 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsj9x\" (UniqueName: \"kubernetes.io/projected/4675c789-acd8-480a-a09f-52ab9a827bb6-kube-api-access-jsj9x\") pod \"nova-cell0-db-create-r5j6n\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.541919 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z9zcf"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.563775 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d0a292-4600-45ed-a947-37e93bccaea8-operator-scripts\") pod \"nova-api-ac64-account-create-update-thwgw\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 
crc kubenswrapper[5047]: I0223 07:08:04.564031 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67q8t\" (UniqueName: \"kubernetes.io/projected/15d0a292-4600-45ed-a947-37e93bccaea8-kube-api-access-67q8t\") pod \"nova-api-ac64-account-create-update-thwgw\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.586567 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d0a292-4600-45ed-a947-37e93bccaea8-operator-scripts\") pod \"nova-api-ac64-account-create-update-thwgw\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.609814 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67q8t\" (UniqueName: \"kubernetes.io/projected/15d0a292-4600-45ed-a947-37e93bccaea8-kube-api-access-67q8t\") pod \"nova-api-ac64-account-create-update-thwgw\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.651506 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.695256 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7dbl\" (UniqueName: \"kubernetes.io/projected/2c981b84-6b45-4ba9-be71-22f074a5ccd4-kube-api-access-g7dbl\") pod \"nova-cell1-db-create-z9zcf\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.695463 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c981b84-6b45-4ba9-be71-22f074a5ccd4-operator-scripts\") pod \"nova-cell1-db-create-z9zcf\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.719532 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-pthd9"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.722244 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.727464 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.811857 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.814047 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjkd\" (UniqueName: \"kubernetes.io/projected/f2fe919f-6631-4332-af78-35e21d657657-kube-api-access-fpjkd\") pod \"nova-cell0-5b36-account-create-update-pthd9\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.814127 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe919f-6631-4332-af78-35e21d657657-operator-scripts\") pod \"nova-cell0-5b36-account-create-update-pthd9\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.814184 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7dbl\" (UniqueName: \"kubernetes.io/projected/2c981b84-6b45-4ba9-be71-22f074a5ccd4-kube-api-access-g7dbl\") pod \"nova-cell1-db-create-z9zcf\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.814273 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c981b84-6b45-4ba9-be71-22f074a5ccd4-operator-scripts\") pod \"nova-cell1-db-create-z9zcf\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.816279 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-pthd9"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 
07:08:04.820168 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c981b84-6b45-4ba9-be71-22f074a5ccd4-operator-scripts\") pod \"nova-cell1-db-create-z9zcf\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.889874 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-5vjkd"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.892629 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7dbl\" (UniqueName: \"kubernetes.io/projected/2c981b84-6b45-4ba9-be71-22f074a5ccd4-kube-api-access-g7dbl\") pod \"nova-cell1-db-create-z9zcf\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.904987 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-5vjkd"] Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.905146 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.909888 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.916209 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-5vjkd\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.916734 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjkd\" (UniqueName: \"kubernetes.io/projected/f2fe919f-6631-4332-af78-35e21d657657-kube-api-access-fpjkd\") pod \"nova-cell0-5b36-account-create-update-pthd9\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.916781 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe919f-6631-4332-af78-35e21d657657-operator-scripts\") pod \"nova-cell0-5b36-account-create-update-pthd9\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.916807 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpbq\" (UniqueName: \"kubernetes.io/projected/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-kube-api-access-5tpbq\") pod \"nova-cell1-08b6-account-create-update-5vjkd\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" 
Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.917851 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe919f-6631-4332-af78-35e21d657657-operator-scripts\") pod \"nova-cell0-5b36-account-create-update-pthd9\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.956409 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjkd\" (UniqueName: \"kubernetes.io/projected/f2fe919f-6631-4332-af78-35e21d657657-kube-api-access-fpjkd\") pod \"nova-cell0-5b36-account-create-update-pthd9\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:04 crc kubenswrapper[5047]: I0223 07:08:04.968008 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.018945 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpbq\" (UniqueName: \"kubernetes.io/projected/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-kube-api-access-5tpbq\") pod \"nova-cell1-08b6-account-create-update-5vjkd\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.019056 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-5vjkd\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.019929 5047 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-5vjkd\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.038766 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpbq\" (UniqueName: \"kubernetes.io/projected/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-kube-api-access-5tpbq\") pod \"nova-cell1-08b6-account-create-update-5vjkd\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.120172 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.242547 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": dial tcp 10.217.0.162:3000: connect: connection refused" Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.270699 5047 generic.go:334] "Generic (PLEG): container finished" podID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerID="b16634f8ca7b2464266a2169de1e2b10c81111ece98874a629f7e7c4dbd657c8" exitCode=0 Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.270752 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerDied","Data":"b16634f8ca7b2464266a2169de1e2b10c81111ece98874a629f7e7c4dbd657c8"} Feb 23 07:08:05 crc kubenswrapper[5047]: I0223 07:08:05.307326 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:07 crc kubenswrapper[5047]: I0223 07:08:07.296773 5047 generic.go:334] "Generic (PLEG): container finished" podID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerID="ace6ecfe551c44e9efb31d1ea90d89e9ff52a5828d4bf6e2593aac5d72081b6e" exitCode=0 Feb 23 07:08:07 crc kubenswrapper[5047]: I0223 07:08:07.296847 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"391de77f-1a20-4fdf-90fb-413ab5e7cc3f","Type":"ContainerDied","Data":"ace6ecfe551c44e9efb31d1ea90d89e9ff52a5828d4bf6e2593aac5d72081b6e"} Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.671976 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.807096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-sg-core-conf-yaml\") pod \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.807203 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-log-httpd\") pod \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.807271 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd5zj\" (UniqueName: \"kubernetes.io/projected/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-kube-api-access-hd5zj\") pod \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.807347 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-scripts\") pod \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.807404 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-config-data\") pod \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.807579 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-run-httpd\") pod \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.807634 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-combined-ca-bundle\") pod \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\" (UID: \"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c\") " Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.809827 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" (UID: "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.810659 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" (UID: "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.828177 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-scripts" (OuterVolumeSpecName: "scripts") pod "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" (UID: "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.828708 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-kube-api-access-hd5zj" (OuterVolumeSpecName: "kube-api-access-hd5zj") pod "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" (UID: "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c"). InnerVolumeSpecName "kube-api-access-hd5zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.872222 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" (UID: "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.912740 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.912782 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.912794 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.912803 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd5zj\" (UniqueName: \"kubernetes.io/projected/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-kube-api-access-hd5zj\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.912815 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.952152 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-config-data" (OuterVolumeSpecName: "config-data") pod "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" (UID: "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:08 crc kubenswrapper[5047]: I0223 07:08:08.957978 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" (UID: "a4d4b313-c993-4db2-ac64-87a5e7b3eb9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.015383 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.015429 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.054657 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.100383 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-5vjkd"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.123671 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-pthd9"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219061 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-scripts\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219241 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-httpd-run\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219332 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219362 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-config-data\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219383 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-w65gd"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219443 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-combined-ca-bundle\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219475 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-logs\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6xjw\" (UniqueName: \"kubernetes.io/projected/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-kube-api-access-s6xjw\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.219561 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-internal-tls-certs\") pod \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\" (UID: \"391de77f-1a20-4fdf-90fb-413ab5e7cc3f\") " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.220075 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.222695 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-logs" (OuterVolumeSpecName: "logs") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.236935 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-scripts" (OuterVolumeSpecName: "scripts") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.246864 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z9zcf"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.250721 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.252723 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-kube-api-access-s6xjw" (OuterVolumeSpecName: "kube-api-access-s6xjw") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "kube-api-access-s6xjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.276689 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.301369 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r5j6n"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.322237 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.322310 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.322326 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.322343 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.322356 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6xjw\" (UniqueName: \"kubernetes.io/projected/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-kube-api-access-s6xjw\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc 
kubenswrapper[5047]: I0223 07:08:09.322367 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.340797 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac64-account-create-update-thwgw"] Feb 23 07:08:09 crc kubenswrapper[5047]: W0223 07:08:09.342394 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf29958fb_3a36_427e_8094_62f7522b7a17.slice/crio-50597a9835253b8b1600f3f45196fa05180d5e0f4546ef14277aebfeb1eff068 WatchSource:0}: Error finding container 50597a9835253b8b1600f3f45196fa05180d5e0f4546ef14277aebfeb1eff068: Status 404 returned error can't find the container with id 50597a9835253b8b1600f3f45196fa05180d5e0f4546ef14277aebfeb1eff068 Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.347789 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-config-data" (OuterVolumeSpecName: "config-data") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.376988 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d867c5cb7-qbvv7"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.385173 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"391de77f-1a20-4fdf-90fb-413ab5e7cc3f","Type":"ContainerDied","Data":"5e71b3aefe34ac6a57d430db9928cacbb52c8185e7e90d9498c75b2061200258"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.385228 5047 scope.go:117] "RemoveContainer" containerID="ace6ecfe551c44e9efb31d1ea90d89e9ff52a5828d4bf6e2593aac5d72081b6e" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.385380 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.389100 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ae539167-f011-4f50-8ce6-df90580fa157","Type":"ContainerStarted","Data":"458110c59c1b32388e1bbf0559e62ae01cdd38bb1a5dc8a73f30c5ca7efaf50f"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.394835 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-w65gd" event={"ID":"49095dcf-0259-4bde-a0c1-695841ebd224","Type":"ContainerStarted","Data":"09be42f3b7665681c358e4273e8d5025688f3c133eabb8dc3d3da52e68d71a11"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.395804 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.398062 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" 
event={"ID":"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc","Type":"ContainerStarted","Data":"81a296456eba8a685b7fe86ea6b915aa517ee2f48181d35754bc4e9b0b790d5d"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.400406 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5b36-account-create-update-pthd9" event={"ID":"f2fe919f-6631-4332-af78-35e21d657657","Type":"ContainerStarted","Data":"ee812f08421e951f9e204307a2b73013867dc010549b5e8094575c3742d2ed48"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.403409 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "391de77f-1a20-4fdf-90fb-413ab5e7cc3f" (UID: "391de77f-1a20-4fdf-90fb-413ab5e7cc3f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.410213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z9zcf" event={"ID":"2c981b84-6b45-4ba9-be71-22f074a5ccd4","Type":"ContainerStarted","Data":"eb18bc35df2b49d645ec8ce839027d236c0e4766cf7d7fde14e96146ceee2518"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.416356 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac64-account-create-update-thwgw" event={"ID":"15d0a292-4600-45ed-a947-37e93bccaea8","Type":"ContainerStarted","Data":"d594301eb3e738e73919431797da472e34863fb2feb998b0cd2252c4858a8821"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.424578 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.424613 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.424628 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391de77f-1a20-4fdf-90fb-413ab5e7cc3f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.426409 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.7709582360000002 podStartE2EDuration="14.426387835s" podCreationTimestamp="2026-02-23 07:07:55 +0000 UTC" firstStartedPulling="2026-02-23 07:07:56.498763852 +0000 UTC m=+1398.750090996" lastFinishedPulling="2026-02-23 07:08:08.154193461 +0000 UTC m=+1410.405520595" observedRunningTime="2026-02-23 07:08:09.412501484 +0000 UTC m=+1411.663828618" watchObservedRunningTime="2026-02-23 07:08:09.426387835 +0000 UTC m=+1411.677714959" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.426840 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d4b313-c993-4db2-ac64-87a5e7b3eb9c","Type":"ContainerDied","Data":"bb7be07bd007c584bd7b01df32314aaf8c744c39fd8e80c73ff9040f3626f3e9"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.427011 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.456692 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r5j6n" event={"ID":"4675c789-acd8-480a-a09f-52ab9a827bb6","Type":"ContainerStarted","Data":"372e5c4ca04c886e66d514b6daa8fdd52c3312d6d7a2b3570eaf542da658cd64"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.463124 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d867c5cb7-qbvv7" event={"ID":"f29958fb-3a36-427e-8094-62f7522b7a17","Type":"ContainerStarted","Data":"50597a9835253b8b1600f3f45196fa05180d5e0f4546ef14277aebfeb1eff068"} Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.468246 5047 scope.go:117] "RemoveContainer" containerID="32f4b174839bd5ba33fa916bb23331139b624944cb0f61b780cd7e606f20324a" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.512687 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.549563 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.559121 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: E0223 07:08:09.559673 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="sg-core" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.559689 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="sg-core" Feb 23 07:08:09 crc kubenswrapper[5047]: E0223 07:08:09.559702 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-httpd" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.559709 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-httpd" Feb 23 07:08:09 crc kubenswrapper[5047]: E0223 07:08:09.559721 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-log" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.559728 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-log" Feb 23 07:08:09 crc kubenswrapper[5047]: E0223 07:08:09.559749 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="proxy-httpd" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.559756 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="proxy-httpd" Feb 23 07:08:09 crc kubenswrapper[5047]: E0223 07:08:09.559773 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-central-agent" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.559779 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-central-agent" Feb 23 07:08:09 crc kubenswrapper[5047]: E0223 07:08:09.559793 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-notification-agent" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.559817 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-notification-agent" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.560050 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-central-agent" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.560064 5047 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-log" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.560082 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="ceilometer-notification-agent" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.560089 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="sg-core" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.560099 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" containerName="proxy-httpd" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.560111 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" containerName="glance-httpd" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.562013 5047 scope.go:117] "RemoveContainer" containerID="6cb9f954c7db0d6e83e722d0a6b5a9e9b5954f9e8e9da60cba495b6d5d534e8d" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.562173 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.566176 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.566475 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.567224 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.630513 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-log-httpd\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.630581 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-scripts\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.630639 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-config-data\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.630697 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-run-httpd\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 
07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.630717 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.630782 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.630799 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzchz\" (UniqueName: \"kubernetes.io/projected/eae8085b-f687-4458-b894-945152e6bf1c-kube-api-access-xzchz\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.636166 5047 scope.go:117] "RemoveContainer" containerID="1710a1afaf8b8d1863ad4a91900f4520a33242333b0ea1818cb9d87e03a7c7e5" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.668147 5047 scope.go:117] "RemoveContainer" containerID="b16634f8ca7b2464266a2169de1e2b10c81111ece98874a629f7e7c4dbd657c8" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.698486 5047 scope.go:117] "RemoveContainer" containerID="8d83ede9036b9fd8c2e334d730c2786748c4c52e7f0242494531c434bc3ec5be" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.733200 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-run-httpd\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " 
pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.733264 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.733334 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.733365 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzchz\" (UniqueName: \"kubernetes.io/projected/eae8085b-f687-4458-b894-945152e6bf1c-kube-api-access-xzchz\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.733414 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-log-httpd\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.733454 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-scripts\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.733493 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-config-data\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.739189 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.739532 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-run-httpd\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.739768 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-config-data\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.740402 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-log-httpd\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.769383 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.769530 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-scripts\") pod \"ceilometer-0\" (UID: 
\"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.776423 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.781079 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.806134 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.840094 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzchz\" (UniqueName: \"kubernetes.io/projected/eae8085b-f687-4458-b894-945152e6bf1c-kube-api-access-xzchz\") pod \"ceilometer-0\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.845102 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.845241 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.853575 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.854297 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.908816 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.948983 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.949133 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-logs\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.949256 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.949371 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.949467 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.949546 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nmbl\" (UniqueName: \"kubernetes.io/projected/69d03df4-c334-4c64-a273-e4e307df5add-kube-api-access-7nmbl\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.949612 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:09 crc kubenswrapper[5047]: I0223 07:08:09.957307 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.059438 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nmbl\" (UniqueName: \"kubernetes.io/projected/69d03df4-c334-4c64-a273-e4e307df5add-kube-api-access-7nmbl\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.061448 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.061603 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.063306 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.063600 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-logs\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.063778 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.064140 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " 
pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.076729 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.078827 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.079130 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-logs\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.095549 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.098328 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 
07:08:10.098393 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.110300 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.118808 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nmbl\" (UniqueName: \"kubernetes.io/projected/69d03df4-c334-4c64-a273-e4e307df5add-kube-api-access-7nmbl\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.119420 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.218576 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.371521 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="391de77f-1a20-4fdf-90fb-413ab5e7cc3f" path="/var/lib/kubelet/pods/391de77f-1a20-4fdf-90fb-413ab5e7cc3f/volumes" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.372636 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d4b313-c993-4db2-ac64-87a5e7b3eb9c" path="/var/lib/kubelet/pods/a4d4b313-c993-4db2-ac64-87a5e7b3eb9c/volumes" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.490954 5047 generic.go:334] "Generic (PLEG): container finished" podID="c7a96bf2-f9b9-4f1e-a884-202eb437a8fc" containerID="95da99e7abef27d0904bfe545c3f30b1b0448c94b49b798f69241c902b5aaf0e" exitCode=0 Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.491116 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" event={"ID":"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc","Type":"ContainerDied","Data":"95da99e7abef27d0904bfe545c3f30b1b0448c94b49b798f69241c902b5aaf0e"} Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.494653 5047 generic.go:334] "Generic (PLEG): container finished" podID="4675c789-acd8-480a-a09f-52ab9a827bb6" containerID="0e3b47db4cbec9007d82a1b680b785316dfde205b6f0c34008d7119621f5b73a" exitCode=0 Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.494727 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r5j6n" event={"ID":"4675c789-acd8-480a-a09f-52ab9a827bb6","Type":"ContainerDied","Data":"0e3b47db4cbec9007d82a1b680b785316dfde205b6f0c34008d7119621f5b73a"} Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.498154 5047 generic.go:334] "Generic (PLEG): container finished" podID="f2fe919f-6631-4332-af78-35e21d657657" containerID="b44f6b2c8f1e447760bbca2e2eef8fead4c67ab049896aef3ccca0fea1b3b631" exitCode=0 Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.498281 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5b36-account-create-update-pthd9" 
event={"ID":"f2fe919f-6631-4332-af78-35e21d657657","Type":"ContainerDied","Data":"b44f6b2c8f1e447760bbca2e2eef8fead4c67ab049896aef3ccca0fea1b3b631"} Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.505353 5047 generic.go:334] "Generic (PLEG): container finished" podID="15d0a292-4600-45ed-a947-37e93bccaea8" containerID="61434e9e4d467386a78cd46fa836a42ead7ba33809fa4c13e1ad548d69e1b881" exitCode=0 Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.505423 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac64-account-create-update-thwgw" event={"ID":"15d0a292-4600-45ed-a947-37e93bccaea8","Type":"ContainerDied","Data":"61434e9e4d467386a78cd46fa836a42ead7ba33809fa4c13e1ad548d69e1b881"} Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.507190 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.510183 5047 generic.go:334] "Generic (PLEG): container finished" podID="49095dcf-0259-4bde-a0c1-695841ebd224" containerID="b2223304e21e1939208327e76234fdea34d918e35b270aa7f780d4e3c732e5cb" exitCode=0 Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.510230 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-w65gd" event={"ID":"49095dcf-0259-4bde-a0c1-695841ebd224","Type":"ContainerDied","Data":"b2223304e21e1939208327e76234fdea34d918e35b270aa7f780d4e3c732e5cb"} Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.512706 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d867c5cb7-qbvv7" event={"ID":"f29958fb-3a36-427e-8094-62f7522b7a17","Type":"ContainerStarted","Data":"370e63e6a062f56f822bb2fff7821e2ee1f32ee82378743cc49b225b72866f01"} Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.518448 5047 generic.go:334] "Generic (PLEG): container finished" podID="2c981b84-6b45-4ba9-be71-22f074a5ccd4" 
containerID="a45e4023fdc3f5de91bd12746e0ec06ad8dc73f14a73ff3efba60ce6bbe4cb68" exitCode=0 Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.519966 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z9zcf" event={"ID":"2c981b84-6b45-4ba9-be71-22f074a5ccd4","Type":"ContainerDied","Data":"a45e4023fdc3f5de91bd12746e0ec06ad8dc73f14a73ff3efba60ce6bbe4cb68"} Feb 23 07:08:10 crc kubenswrapper[5047]: W0223 07:08:10.544178 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeae8085b_f687_4458_b894_945152e6bf1c.slice/crio-3296dbbd357a381d279f52bf9fd9e406cf4782ef6e740c7b1cceb5e3a943922b WatchSource:0}: Error finding container 3296dbbd357a381d279f52bf9fd9e406cf4782ef6e740c7b1cceb5e3a943922b: Status 404 returned error can't find the container with id 3296dbbd357a381d279f52bf9fd9e406cf4782ef6e740c7b1cceb5e3a943922b Feb 23 07:08:10 crc kubenswrapper[5047]: E0223 07:08:10.556842 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4675c789_acd8_480a_a09f_52ab9a827bb6.slice/crio-conmon-0e3b47db4cbec9007d82a1b680b785316dfde205b6f0c34008d7119621f5b73a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4675c789_acd8_480a_a09f_52ab9a827bb6.slice/crio-0e3b47db4cbec9007d82a1b680b785316dfde205b6f0c34008d7119621f5b73a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d0a292_4600_45ed_a947_37e93bccaea8.slice/crio-61434e9e4d467386a78cd46fa836a42ead7ba33809fa4c13e1ad548d69e1b881.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d0a292_4600_45ed_a947_37e93bccaea8.slice/crio-conmon-61434e9e4d467386a78cd46fa836a42ead7ba33809fa4c13e1ad548d69e1b881.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7a96bf2_f9b9_4f1e_a884_202eb437a8fc.slice/crio-conmon-95da99e7abef27d0904bfe545c3f30b1b0448c94b49b798f69241c902b5aaf0e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2fe919f_6631_4332_af78_35e21d657657.slice/crio-b44f6b2c8f1e447760bbca2e2eef8fead4c67ab049896aef3ccca0fea1b3b631.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:08:10 crc kubenswrapper[5047]: I0223 07:08:10.573878 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.374220 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.530565 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69d03df4-c334-4c64-a273-e4e307df5add","Type":"ContainerStarted","Data":"e519685364edf1623978c606be8efb8fc2ac904406f478ece6affa0ab3a2f58b"} Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.536443 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d867c5cb7-qbvv7" event={"ID":"f29958fb-3a36-427e-8094-62f7522b7a17","Type":"ContainerStarted","Data":"b5495e9f8841c7dec1c3b19f82d389219f186ba7641ce0bc83c99ddb4383352d"} Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.536844 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.536920 5047 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.544289 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerStarted","Data":"3fbc087f12473c0d3d8711443916113e34cd574d564699d6043b597f9dd6f857"} Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.544382 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerStarted","Data":"3296dbbd357a381d279f52bf9fd9e406cf4782ef6e740c7b1cceb5e3a943922b"} Feb 23 07:08:11 crc kubenswrapper[5047]: I0223 07:08:11.570367 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d867c5cb7-qbvv7" podStartSLOduration=10.570338522 podStartE2EDuration="10.570338522s" podCreationTimestamp="2026-02-23 07:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:11.561455078 +0000 UTC m=+1413.812782212" watchObservedRunningTime="2026-02-23 07:08:11.570338522 +0000 UTC m=+1413.821665666" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.157109 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.194881 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4675c789-acd8-480a-a09f-52ab9a827bb6-operator-scripts\") pod \"4675c789-acd8-480a-a09f-52ab9a827bb6\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.195027 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsj9x\" (UniqueName: \"kubernetes.io/projected/4675c789-acd8-480a-a09f-52ab9a827bb6-kube-api-access-jsj9x\") pod \"4675c789-acd8-480a-a09f-52ab9a827bb6\" (UID: \"4675c789-acd8-480a-a09f-52ab9a827bb6\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.196421 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4675c789-acd8-480a-a09f-52ab9a827bb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4675c789-acd8-480a-a09f-52ab9a827bb6" (UID: "4675c789-acd8-480a-a09f-52ab9a827bb6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.213186 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4675c789-acd8-480a-a09f-52ab9a827bb6-kube-api-access-jsj9x" (OuterVolumeSpecName: "kube-api-access-jsj9x") pod "4675c789-acd8-480a-a09f-52ab9a827bb6" (UID: "4675c789-acd8-480a-a09f-52ab9a827bb6"). InnerVolumeSpecName "kube-api-access-jsj9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.303606 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4675c789-acd8-480a-a09f-52ab9a827bb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.303640 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsj9x\" (UniqueName: \"kubernetes.io/projected/4675c789-acd8-480a-a09f-52ab9a827bb6-kube-api-access-jsj9x\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.314558 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.350164 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.407450 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d0a292-4600-45ed-a947-37e93bccaea8-operator-scripts\") pod \"15d0a292-4600-45ed-a947-37e93bccaea8\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.407528 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49095dcf-0259-4bde-a0c1-695841ebd224-operator-scripts\") pod \"49095dcf-0259-4bde-a0c1-695841ebd224\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.407757 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx7c6\" (UniqueName: \"kubernetes.io/projected/49095dcf-0259-4bde-a0c1-695841ebd224-kube-api-access-mx7c6\") pod 
\"49095dcf-0259-4bde-a0c1-695841ebd224\" (UID: \"49095dcf-0259-4bde-a0c1-695841ebd224\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.407843 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67q8t\" (UniqueName: \"kubernetes.io/projected/15d0a292-4600-45ed-a947-37e93bccaea8-kube-api-access-67q8t\") pod \"15d0a292-4600-45ed-a947-37e93bccaea8\" (UID: \"15d0a292-4600-45ed-a947-37e93bccaea8\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.420729 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.422606 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49095dcf-0259-4bde-a0c1-695841ebd224-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49095dcf-0259-4bde-a0c1-695841ebd224" (UID: "49095dcf-0259-4bde-a0c1-695841ebd224"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.427254 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15d0a292-4600-45ed-a947-37e93bccaea8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15d0a292-4600-45ed-a947-37e93bccaea8" (UID: "15d0a292-4600-45ed-a947-37e93bccaea8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.440293 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.478853 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49095dcf-0259-4bde-a0c1-695841ebd224-kube-api-access-mx7c6" (OuterVolumeSpecName: "kube-api-access-mx7c6") pod "49095dcf-0259-4bde-a0c1-695841ebd224" (UID: "49095dcf-0259-4bde-a0c1-695841ebd224"). InnerVolumeSpecName "kube-api-access-mx7c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.503194 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d0a292-4600-45ed-a947-37e93bccaea8-kube-api-access-67q8t" (OuterVolumeSpecName: "kube-api-access-67q8t") pod "15d0a292-4600-45ed-a947-37e93bccaea8" (UID: "15d0a292-4600-45ed-a947-37e93bccaea8"). InnerVolumeSpecName "kube-api-access-67q8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.507842 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.509819 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7dbl\" (UniqueName: \"kubernetes.io/projected/2c981b84-6b45-4ba9-be71-22f074a5ccd4-kube-api-access-g7dbl\") pod \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.510136 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tpbq\" (UniqueName: \"kubernetes.io/projected/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-kube-api-access-5tpbq\") pod \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.510202 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-operator-scripts\") pod \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\" (UID: \"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.510242 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c981b84-6b45-4ba9-be71-22f074a5ccd4-operator-scripts\") pod \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\" (UID: \"2c981b84-6b45-4ba9-be71-22f074a5ccd4\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.510810 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67q8t\" (UniqueName: \"kubernetes.io/projected/15d0a292-4600-45ed-a947-37e93bccaea8-kube-api-access-67q8t\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.510830 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d0a292-4600-45ed-a947-37e93bccaea8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.510841 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49095dcf-0259-4bde-a0c1-695841ebd224-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.510851 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx7c6\" (UniqueName: \"kubernetes.io/projected/49095dcf-0259-4bde-a0c1-695841ebd224-kube-api-access-mx7c6\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.513880 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc" (UID: "c7a96bf2-f9b9-4f1e-a884-202eb437a8fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.526702 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c981b84-6b45-4ba9-be71-22f074a5ccd4-kube-api-access-g7dbl" (OuterVolumeSpecName: "kube-api-access-g7dbl") pod "2c981b84-6b45-4ba9-be71-22f074a5ccd4" (UID: "2c981b84-6b45-4ba9-be71-22f074a5ccd4"). InnerVolumeSpecName "kube-api-access-g7dbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.530294 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-kube-api-access-5tpbq" (OuterVolumeSpecName: "kube-api-access-5tpbq") pod "c7a96bf2-f9b9-4f1e-a884-202eb437a8fc" (UID: "c7a96bf2-f9b9-4f1e-a884-202eb437a8fc"). InnerVolumeSpecName "kube-api-access-5tpbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.544930 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.581647 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c981b84-6b45-4ba9-be71-22f074a5ccd4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c981b84-6b45-4ba9-be71-22f074a5ccd4" (UID: "2c981b84-6b45-4ba9-be71-22f074a5ccd4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.615729 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpjkd\" (UniqueName: \"kubernetes.io/projected/f2fe919f-6631-4332-af78-35e21d657657-kube-api-access-fpjkd\") pod \"f2fe919f-6631-4332-af78-35e21d657657\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.615896 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe919f-6631-4332-af78-35e21d657657-operator-scripts\") pod \"f2fe919f-6631-4332-af78-35e21d657657\" (UID: \"f2fe919f-6631-4332-af78-35e21d657657\") " Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.616476 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tpbq\" (UniqueName: \"kubernetes.io/projected/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-kube-api-access-5tpbq\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.616493 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.616503 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c981b84-6b45-4ba9-be71-22f074a5ccd4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.616513 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7dbl\" (UniqueName: \"kubernetes.io/projected/2c981b84-6b45-4ba9-be71-22f074a5ccd4-kube-api-access-g7dbl\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.617108 5047 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2fe919f-6631-4332-af78-35e21d657657-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2fe919f-6631-4332-af78-35e21d657657" (UID: "f2fe919f-6631-4332-af78-35e21d657657"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.620783 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2fe919f-6631-4332-af78-35e21d657657-kube-api-access-fpjkd" (OuterVolumeSpecName: "kube-api-access-fpjkd") pod "f2fe919f-6631-4332-af78-35e21d657657" (UID: "f2fe919f-6631-4332-af78-35e21d657657"). InnerVolumeSpecName "kube-api-access-fpjkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.640378 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z9zcf" event={"ID":"2c981b84-6b45-4ba9-be71-22f074a5ccd4","Type":"ContainerDied","Data":"eb18bc35df2b49d645ec8ce839027d236c0e4766cf7d7fde14e96146ceee2518"} Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.640429 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb18bc35df2b49d645ec8ce839027d236c0e4766cf7d7fde14e96146ceee2518" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.640499 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z9zcf" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.663619 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-pthd9" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.663761 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5b36-account-create-update-pthd9" event={"ID":"f2fe919f-6631-4332-af78-35e21d657657","Type":"ContainerDied","Data":"ee812f08421e951f9e204307a2b73013867dc010549b5e8094575c3742d2ed48"} Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.663826 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee812f08421e951f9e204307a2b73013867dc010549b5e8094575c3742d2ed48" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.677094 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-thwgw" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.677113 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac64-account-create-update-thwgw" event={"ID":"15d0a292-4600-45ed-a947-37e93bccaea8","Type":"ContainerDied","Data":"d594301eb3e738e73919431797da472e34863fb2feb998b0cd2252c4858a8821"} Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.677156 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d594301eb3e738e73919431797da472e34863fb2feb998b0cd2252c4858a8821" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.704880 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerStarted","Data":"eaa82cc27eb171c181a912e4e4fae34fda847e4b1b717c32d57ccb3666d1e945"} Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.709027 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-w65gd" event={"ID":"49095dcf-0259-4bde-a0c1-695841ebd224","Type":"ContainerDied","Data":"09be42f3b7665681c358e4273e8d5025688f3c133eabb8dc3d3da52e68d71a11"} Feb 23 07:08:12 
crc kubenswrapper[5047]: I0223 07:08:12.709059 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09be42f3b7665681c358e4273e8d5025688f3c133eabb8dc3d3da52e68d71a11" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.709159 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-w65gd" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.715053 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" event={"ID":"c7a96bf2-f9b9-4f1e-a884-202eb437a8fc","Type":"ContainerDied","Data":"81a296456eba8a685b7fe86ea6b915aa517ee2f48181d35754bc4e9b0b790d5d"} Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.715080 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81a296456eba8a685b7fe86ea6b915aa517ee2f48181d35754bc4e9b0b790d5d" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.715127 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-5vjkd" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.718497 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe919f-6631-4332-af78-35e21d657657-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.718530 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpjkd\" (UniqueName: \"kubernetes.io/projected/f2fe919f-6631-4332-af78-35e21d657657-kube-api-access-fpjkd\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.718619 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-r5j6n" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.718696 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r5j6n" event={"ID":"4675c789-acd8-480a-a09f-52ab9a827bb6","Type":"ContainerDied","Data":"372e5c4ca04c886e66d514b6daa8fdd52c3312d6d7a2b3570eaf542da658cd64"} Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.718733 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372e5c4ca04c886e66d514b6daa8fdd52c3312d6d7a2b3570eaf542da658cd64" Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.988058 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.988366 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-log" containerID="cri-o://1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4" gracePeriod=30 Feb 23 07:08:12 crc kubenswrapper[5047]: I0223 07:08:12.988559 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-httpd" containerID="cri-o://fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435" gracePeriod=30 Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.745108 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerStarted","Data":"abe051ca69b34167efc9d0de57b43e33b3c3eef04bc7b9d896ccfd8cfb9d4325"} Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.748170 5047 generic.go:334] "Generic (PLEG): container finished" podID="261cbbc0-ab7f-4259-94b2-00ab04e23187" 
containerID="1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4" exitCode=143 Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.748244 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"261cbbc0-ab7f-4259-94b2-00ab04e23187","Type":"ContainerDied","Data":"1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4"} Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.753388 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69d03df4-c334-4c64-a273-e4e307df5add","Type":"ContainerStarted","Data":"d3073ef9efc88121d60f580513fe1f931a6371611ccd81a1a8970553da615a84"} Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.753425 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69d03df4-c334-4c64-a273-e4e307df5add","Type":"ContainerStarted","Data":"92383f49d49f2ba213699c653ceecca29ba9aba4cc99635c1909b4aa5789dcd9"} Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.782171 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.791508 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.791483421 podStartE2EDuration="4.791483421s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:13.777796545 +0000 UTC m=+1416.029123679" watchObservedRunningTime="2026-02-23 07:08:13.791483421 +0000 UTC m=+1416.042810555" Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.918772 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ff9c9c5c6-pkfqh"] Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 
07:08:13.919436 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ff9c9c5c6-pkfqh" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-api" containerID="cri-o://39e5ccb29704604c78f137df87ae9930b8b1ecf21406a4701a0850a629ebcdfd" gracePeriod=30 Feb 23 07:08:13 crc kubenswrapper[5047]: I0223 07:08:13.919509 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ff9c9c5c6-pkfqh" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-httpd" containerID="cri-o://48beb39652f6bbd9ec63e7ff1fa9cf1cf2543500d09199781370b0ee22d9c880" gracePeriod=30 Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.766357 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerStarted","Data":"d1225b352aa5840144c80459cab2c25ad24b82d59bf992e67f934f3278ca967e"} Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.766572 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-central-agent" containerID="cri-o://3fbc087f12473c0d3d8711443916113e34cd574d564699d6043b597f9dd6f857" gracePeriod=30 Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.766578 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="proxy-httpd" containerID="cri-o://d1225b352aa5840144c80459cab2c25ad24b82d59bf992e67f934f3278ca967e" gracePeriod=30 Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.766891 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.766628 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-notification-agent" containerID="cri-o://eaa82cc27eb171c181a912e4e4fae34fda847e4b1b717c32d57ccb3666d1e945" gracePeriod=30 Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.766603 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="sg-core" containerID="cri-o://abe051ca69b34167efc9d0de57b43e33b3c3eef04bc7b9d896ccfd8cfb9d4325" gracePeriod=30 Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.772311 5047 generic.go:334] "Generic (PLEG): container finished" podID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerID="48beb39652f6bbd9ec63e7ff1fa9cf1cf2543500d09199781370b0ee22d9c880" exitCode=0 Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.772555 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ff9c9c5c6-pkfqh" event={"ID":"19e4074e-d9fa-436c-97d8-f9d90ea147e2","Type":"ContainerDied","Data":"48beb39652f6bbd9ec63e7ff1fa9cf1cf2543500d09199781370b0ee22d9c880"} Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.788004 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.151209939 podStartE2EDuration="5.787985708s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="2026-02-23 07:08:10.549374608 +0000 UTC m=+1412.800701742" lastFinishedPulling="2026-02-23 07:08:14.186150367 +0000 UTC m=+1416.437477511" observedRunningTime="2026-02-23 07:08:14.786587762 +0000 UTC m=+1417.037914896" watchObservedRunningTime="2026-02-23 07:08:14.787985708 +0000 UTC m=+1417.039312842" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.981185 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wlldh"] Feb 23 07:08:14 crc kubenswrapper[5047]: E0223 07:08:14.981976 5047 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4675c789-acd8-480a-a09f-52ab9a827bb6" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.981993 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4675c789-acd8-480a-a09f-52ab9a827bb6" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: E0223 07:08:14.982008 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49095dcf-0259-4bde-a0c1-695841ebd224" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982016 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="49095dcf-0259-4bde-a0c1-695841ebd224" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: E0223 07:08:14.982047 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c981b84-6b45-4ba9-be71-22f074a5ccd4" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982053 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c981b84-6b45-4ba9-be71-22f074a5ccd4" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: E0223 07:08:14.982065 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a96bf2-f9b9-4f1e-a884-202eb437a8fc" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982070 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a96bf2-f9b9-4f1e-a884-202eb437a8fc" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: E0223 07:08:14.982081 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2fe919f-6631-4332-af78-35e21d657657" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982087 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2fe919f-6631-4332-af78-35e21d657657" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc 
kubenswrapper[5047]: E0223 07:08:14.982097 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d0a292-4600-45ed-a947-37e93bccaea8" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982104 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d0a292-4600-45ed-a947-37e93bccaea8" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982265 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a96bf2-f9b9-4f1e-a884-202eb437a8fc" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982279 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2fe919f-6631-4332-af78-35e21d657657" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982293 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="49095dcf-0259-4bde-a0c1-695841ebd224" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982302 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d0a292-4600-45ed-a947-37e93bccaea8" containerName="mariadb-account-create-update" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982311 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c981b84-6b45-4ba9-be71-22f074a5ccd4" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.982319 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4675c789-acd8-480a-a09f-52ab9a827bb6" containerName="mariadb-database-create" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.985758 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.988439 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.988439 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-spntp" Feb 23 07:08:14 crc kubenswrapper[5047]: I0223 07:08:14.988603 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.010848 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wlldh"] Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.108604 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-scripts\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.108675 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-config-data\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.108786 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxfg\" (UniqueName: \"kubernetes.io/projected/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-kube-api-access-vvxfg\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " 
pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.108915 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.211469 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-scripts\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.211546 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-config-data\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.211618 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxfg\" (UniqueName: \"kubernetes.io/projected/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-kube-api-access-vvxfg\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.211655 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: 
\"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.219533 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-config-data\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.219543 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.223312 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-scripts\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.231379 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxfg\" (UniqueName: \"kubernetes.io/projected/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-kube-api-access-vvxfg\") pod \"nova-cell0-conductor-db-sync-wlldh\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.306548 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.783325 5047 generic.go:334] "Generic (PLEG): container finished" podID="eae8085b-f687-4458-b894-945152e6bf1c" containerID="d1225b352aa5840144c80459cab2c25ad24b82d59bf992e67f934f3278ca967e" exitCode=0 Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.783824 5047 generic.go:334] "Generic (PLEG): container finished" podID="eae8085b-f687-4458-b894-945152e6bf1c" containerID="abe051ca69b34167efc9d0de57b43e33b3c3eef04bc7b9d896ccfd8cfb9d4325" exitCode=2 Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.783406 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerDied","Data":"d1225b352aa5840144c80459cab2c25ad24b82d59bf992e67f934f3278ca967e"} Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.783874 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerDied","Data":"abe051ca69b34167efc9d0de57b43e33b3c3eef04bc7b9d896ccfd8cfb9d4325"} Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.783886 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerDied","Data":"eaa82cc27eb171c181a912e4e4fae34fda847e4b1b717c32d57ccb3666d1e945"} Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.783835 5047 generic.go:334] "Generic (PLEG): container finished" podID="eae8085b-f687-4458-b894-945152e6bf1c" containerID="eaa82cc27eb171c181a912e4e4fae34fda847e4b1b717c32d57ccb3666d1e945" exitCode=0 Feb 23 07:08:15 crc kubenswrapper[5047]: I0223 07:08:15.813878 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wlldh"] Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.791103 5047 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.798214 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wlldh" event={"ID":"8a268e45-febe-4ae8-8c29-8f058d3b7e1d","Type":"ContainerStarted","Data":"e24da6e33daca5cc7a90ce5df0757966a1a477ed7b1390e86ff6aa3cb330c7f1"} Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.815179 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.815825 5047 generic.go:334] "Generic (PLEG): container finished" podID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerID="fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435" exitCode=0 Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.815893 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"261cbbc0-ab7f-4259-94b2-00ab04e23187","Type":"ContainerDied","Data":"fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435"} Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.815965 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"261cbbc0-ab7f-4259-94b2-00ab04e23187","Type":"ContainerDied","Data":"c009f361846efdcfc899415f01d7ab42bdc42d46552a6d6ada17f72951240444"} Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.815970 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.815990 5047 scope.go:117] "RemoveContainer" containerID="fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.856405 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.889677 5047 scope.go:117] "RemoveContainer" containerID="1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.952494 5047 scope.go:117] "RemoveContainer" containerID="fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435" Feb 23 07:08:16 crc kubenswrapper[5047]: E0223 07:08:16.952975 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435\": container with ID starting with fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435 not found: ID does not exist" containerID="fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.953009 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435"} err="failed to get container status \"fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435\": rpc error: code = NotFound desc = could not find container \"fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435\": container with ID starting with fc5b0a7fc5af478f48124dfbe64534cecb6a01a445a34e3c0217362bf0299435 not found: ID does not exist" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.953031 5047 scope.go:117] "RemoveContainer" 
containerID="1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.954867 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-public-tls-certs\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.954936 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-config-data\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.955002 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-scripts\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.955072 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-httpd-run\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.955138 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-logs\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.955176 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-combined-ca-bundle\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.955208 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.955261 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtqwl\" (UniqueName: \"kubernetes.io/projected/261cbbc0-ab7f-4259-94b2-00ab04e23187-kube-api-access-jtqwl\") pod \"261cbbc0-ab7f-4259-94b2-00ab04e23187\" (UID: \"261cbbc0-ab7f-4259-94b2-00ab04e23187\") " Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.956387 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:16 crc kubenswrapper[5047]: E0223 07:08:16.956571 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4\": container with ID starting with 1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4 not found: ID does not exist" containerID="1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.956617 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4"} err="failed to get container status \"1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4\": rpc error: code = NotFound desc = could not find container \"1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4\": container with ID starting with 1dd1586e3c8a4c703445e7b1be23cebbba7ba7a30a2d668e41638e09f76c54f4 not found: ID does not exist" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.957231 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-logs" (OuterVolumeSpecName: "logs") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.963663 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.965835 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261cbbc0-ab7f-4259-94b2-00ab04e23187-kube-api-access-jtqwl" (OuterVolumeSpecName: "kube-api-access-jtqwl") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "kube-api-access-jtqwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:16 crc kubenswrapper[5047]: I0223 07:08:16.969183 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-scripts" (OuterVolumeSpecName: "scripts") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.020921 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-config-data" (OuterVolumeSpecName: "config-data") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.022947 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.058715 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.058792 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.058806 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtqwl\" (UniqueName: \"kubernetes.io/projected/261cbbc0-ab7f-4259-94b2-00ab04e23187-kube-api-access-jtqwl\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.058830 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.058841 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.058852 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.058865 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/261cbbc0-ab7f-4259-94b2-00ab04e23187-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.090632 5047 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.101420 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "261cbbc0-ab7f-4259-94b2-00ab04e23187" (UID: "261cbbc0-ab7f-4259-94b2-00ab04e23187"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.165766 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/261cbbc0-ab7f-4259-94b2-00ab04e23187-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.165828 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.211365 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.236020 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.245825 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:08:17 crc kubenswrapper[5047]: E0223 07:08:17.246341 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-httpd" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.246354 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-httpd" Feb 23 07:08:17 crc kubenswrapper[5047]: E0223 
07:08:17.246376 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-log" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.246381 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-log" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.246558 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-log" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.246581 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" containerName="glance-httpd" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.247698 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.255777 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.258535 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.258993 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370165 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370271 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-njggm\" (UniqueName: \"kubernetes.io/projected/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-kube-api-access-njggm\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370314 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370350 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370459 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370485 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-logs\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370519 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.370564 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.472737 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.472931 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.472963 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.473013 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njggm\" (UniqueName: 
\"kubernetes.io/projected/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-kube-api-access-njggm\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.473077 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.473124 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.473178 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.473215 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-logs\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.473962 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-logs\") pod 
\"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.473953 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.474113 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.477464 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.477639 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.479755 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") 
" pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.482423 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.495011 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njggm\" (UniqueName: \"kubernetes.io/projected/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-kube-api-access-njggm\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.518376 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " pod="openstack/glance-default-external-api-0" Feb 23 07:08:17 crc kubenswrapper[5047]: I0223 07:08:17.615488 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:08:18 crc kubenswrapper[5047]: I0223 07:08:18.203758 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:08:18 crc kubenswrapper[5047]: I0223 07:08:18.360390 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261cbbc0-ab7f-4259-94b2-00ab04e23187" path="/var/lib/kubelet/pods/261cbbc0-ab7f-4259-94b2-00ab04e23187/volumes" Feb 23 07:08:18 crc kubenswrapper[5047]: I0223 07:08:18.852266 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d9eb562-3f84-458f-885a-e2fbb3e86bf3","Type":"ContainerStarted","Data":"274a07768bdf167f837f50d8e0595e4b99f3367f27a526c0ec7cbb5e02613e3a"} Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.658297 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.889202 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d9eb562-3f84-458f-885a-e2fbb3e86bf3","Type":"ContainerStarted","Data":"182404da8e3655734af75c4e164a4fa1026d1387cba22058cc3f4e7a9156ac6b"} Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.889269 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d9eb562-3f84-458f-885a-e2fbb3e86bf3","Type":"ContainerStarted","Data":"9fde3b6829cb3940810ad8ed9c3d9375377b1ab751496ae29f4114c773487dad"} Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.912464 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.918976 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=2.918955721 podStartE2EDuration="2.918955721s" podCreationTimestamp="2026-02-23 07:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:19.914005327 +0000 UTC m=+1422.165332461" watchObservedRunningTime="2026-02-23 07:08:19.918955721 +0000 UTC m=+1422.170282855" Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.986877 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c6b9994d4-nt845"] Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.988417 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-c6b9994d4-nt845" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-log" containerID="cri-o://71648def4db91a786a875c748f6f4a8187a401446c788caf05a7425b249ebded" gracePeriod=30 Feb 23 07:08:19 crc kubenswrapper[5047]: I0223 07:08:19.988658 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-c6b9994d4-nt845" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-api" containerID="cri-o://4b60e5151d596147b7021c9545594a95c4b3f30414105fd3a7f9fe33908978eb" gracePeriod=30 Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.574181 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.574650 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.624027 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.642625 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.913856 5047 generic.go:334] "Generic (PLEG): container finished" podID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerID="71648def4db91a786a875c748f6f4a8187a401446c788caf05a7425b249ebded" exitCode=143 Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.913963 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6b9994d4-nt845" event={"ID":"a7cca3de-06a4-4adf-b400-fe78a34dbb65","Type":"ContainerDied","Data":"71648def4db91a786a875c748f6f4a8187a401446c788caf05a7425b249ebded"} Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.915686 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:20 crc kubenswrapper[5047]: I0223 07:08:20.915707 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:21 crc kubenswrapper[5047]: I0223 07:08:21.929767 5047 generic.go:334] "Generic (PLEG): container finished" podID="eae8085b-f687-4458-b894-945152e6bf1c" containerID="3fbc087f12473c0d3d8711443916113e34cd574d564699d6043b597f9dd6f857" exitCode=0 Feb 23 07:08:21 crc kubenswrapper[5047]: I0223 07:08:21.929849 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerDied","Data":"3fbc087f12473c0d3d8711443916113e34cd574d564699d6043b597f9dd6f857"} Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.501224 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.610672 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-combined-ca-bundle\") pod \"eae8085b-f687-4458-b894-945152e6bf1c\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.610891 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-log-httpd\") pod \"eae8085b-f687-4458-b894-945152e6bf1c\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.610993 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-scripts\") pod \"eae8085b-f687-4458-b894-945152e6bf1c\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.611029 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-run-httpd\") pod \"eae8085b-f687-4458-b894-945152e6bf1c\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.611090 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-config-data\") pod \"eae8085b-f687-4458-b894-945152e6bf1c\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.611139 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-sg-core-conf-yaml\") pod \"eae8085b-f687-4458-b894-945152e6bf1c\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.611256 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzchz\" (UniqueName: \"kubernetes.io/projected/eae8085b-f687-4458-b894-945152e6bf1c-kube-api-access-xzchz\") pod \"eae8085b-f687-4458-b894-945152e6bf1c\" (UID: \"eae8085b-f687-4458-b894-945152e6bf1c\") " Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.612845 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eae8085b-f687-4458-b894-945152e6bf1c" (UID: "eae8085b-f687-4458-b894-945152e6bf1c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.613120 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eae8085b-f687-4458-b894-945152e6bf1c" (UID: "eae8085b-f687-4458-b894-945152e6bf1c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.619252 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae8085b-f687-4458-b894-945152e6bf1c-kube-api-access-xzchz" (OuterVolumeSpecName: "kube-api-access-xzchz") pod "eae8085b-f687-4458-b894-945152e6bf1c" (UID: "eae8085b-f687-4458-b894-945152e6bf1c"). InnerVolumeSpecName "kube-api-access-xzchz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.651267 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-scripts" (OuterVolumeSpecName: "scripts") pod "eae8085b-f687-4458-b894-945152e6bf1c" (UID: "eae8085b-f687-4458-b894-945152e6bf1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.655793 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eae8085b-f687-4458-b894-945152e6bf1c" (UID: "eae8085b-f687-4458-b894-945152e6bf1c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.714409 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.714448 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.714459 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.714468 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzchz\" (UniqueName: \"kubernetes.io/projected/eae8085b-f687-4458-b894-945152e6bf1c-kube-api-access-xzchz\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc 
kubenswrapper[5047]: I0223 07:08:22.714481 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eae8085b-f687-4458-b894-945152e6bf1c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.727638 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eae8085b-f687-4458-b894-945152e6bf1c" (UID: "eae8085b-f687-4458-b894-945152e6bf1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.798220 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-config-data" (OuterVolumeSpecName: "config-data") pod "eae8085b-f687-4458-b894-945152e6bf1c" (UID: "eae8085b-f687-4458-b894-945152e6bf1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.816717 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.816754 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae8085b-f687-4458-b894-945152e6bf1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.957077 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.957102 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.957137 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.957068 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eae8085b-f687-4458-b894-945152e6bf1c","Type":"ContainerDied","Data":"3296dbbd357a381d279f52bf9fd9e406cf4782ef6e740c7b1cceb5e3a943922b"} Feb 23 07:08:22 crc kubenswrapper[5047]: I0223 07:08:22.957331 5047 scope.go:117] "RemoveContainer" containerID="d1225b352aa5840144c80459cab2c25ad24b82d59bf992e67f934f3278ca967e" Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.010264 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.024509 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.053858 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:23 crc kubenswrapper[5047]: 
E0223 07:08:23.054988 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-central-agent" Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.055008 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-central-agent" Feb 23 07:08:23 crc kubenswrapper[5047]: E0223 07:08:23.055037 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="sg-core" Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.055044 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="sg-core" Feb 23 07:08:23 crc kubenswrapper[5047]: E0223 07:08:23.055058 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-notification-agent" Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.055064 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-notification-agent" Feb 23 07:08:23 crc kubenswrapper[5047]: E0223 07:08:23.055080 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="proxy-httpd" Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.055087 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="proxy-httpd" Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.055337 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="proxy-httpd" Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.055383 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="sg-core" Feb 23 07:08:23 crc 
kubenswrapper[5047]: I0223 07:08:23.055406 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-central-agent"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.055426 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae8085b-f687-4458-b894-945152e6bf1c" containerName="ceilometer-notification-agent"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.057806 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.060764 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.061038 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.082758 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.227178 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct7c5\" (UniqueName: \"kubernetes.io/projected/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-kube-api-access-ct7c5\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.227240 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.227265 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-config-data\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.227282 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.227307 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-run-httpd\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.227604 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-log-httpd\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.227799 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-scripts\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330080 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-log-httpd\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330163 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-scripts\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330254 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct7c5\" (UniqueName: \"kubernetes.io/projected/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-kube-api-access-ct7c5\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330288 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330307 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-config-data\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330326 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330342 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-run-httpd\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.330798 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-run-httpd\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.331117 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-log-httpd\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.337811 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.343073 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-scripts\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.344154 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-config-data\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.346724 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.357799 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct7c5\" (UniqueName: \"kubernetes.io/projected/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-kube-api-access-ct7c5\") pod \"ceilometer-0\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") " pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.388043 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.764100 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.769818 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.984062 5047 generic.go:334] "Generic (PLEG): container finished" podID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerID="4b60e5151d596147b7021c9545594a95c4b3f30414105fd3a7f9fe33908978eb" exitCode=0
Feb 23 07:08:23 crc kubenswrapper[5047]: I0223 07:08:23.984118 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6b9994d4-nt845" event={"ID":"a7cca3de-06a4-4adf-b400-fe78a34dbb65","Type":"ContainerDied","Data":"4b60e5151d596147b7021c9545594a95c4b3f30414105fd3a7f9fe33908978eb"}
Feb 23 07:08:24 crc kubenswrapper[5047]: I0223 07:08:24.356476 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae8085b-f687-4458-b894-945152e6bf1c" path="/var/lib/kubelet/pods/eae8085b-f687-4458-b894-945152e6bf1c/volumes"
Feb 23 07:08:27 crc kubenswrapper[5047]: I0223 07:08:27.616408 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:27 crc kubenswrapper[5047]: I0223 07:08:27.616808 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:27 crc kubenswrapper[5047]: I0223 07:08:27.668863 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:27 crc kubenswrapper[5047]: I0223 07:08:27.681490 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:28 crc kubenswrapper[5047]: I0223 07:08:28.063645 5047 generic.go:334] "Generic (PLEG): container finished" podID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerID="39e5ccb29704604c78f137df87ae9930b8b1ecf21406a4701a0850a629ebcdfd" exitCode=0
Feb 23 07:08:28 crc kubenswrapper[5047]: I0223 07:08:28.063731 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ff9c9c5c6-pkfqh" event={"ID":"19e4074e-d9fa-436c-97d8-f9d90ea147e2","Type":"ContainerDied","Data":"39e5ccb29704604c78f137df87ae9930b8b1ecf21406a4701a0850a629ebcdfd"}
Feb 23 07:08:28 crc kubenswrapper[5047]: I0223 07:08:28.064199 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:28 crc kubenswrapper[5047]: I0223 07:08:28.064347 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:28 crc kubenswrapper[5047]: I0223 07:08:28.993118 5047 scope.go:117] "RemoveContainer" containerID="abe051ca69b34167efc9d0de57b43e33b3c3eef04bc7b9d896ccfd8cfb9d4325"
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.047505 5047 scope.go:117] "RemoveContainer" containerID="eaa82cc27eb171c181a912e4e4fae34fda847e4b1b717c32d57ccb3666d1e945"
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.157979 5047 scope.go:117] "RemoveContainer" containerID="3fbc087f12473c0d3d8711443916113e34cd574d564699d6043b597f9dd6f857"
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.216944 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c6b9994d4-nt845"
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.326762 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwkk5\" (UniqueName: \"kubernetes.io/projected/a7cca3de-06a4-4adf-b400-fe78a34dbb65-kube-api-access-xwkk5\") pod \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.326896 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-combined-ca-bundle\") pod \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.329339 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-internal-tls-certs\") pod \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.329482 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-public-tls-certs\") pod \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.329551 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-scripts\") pod \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.329600 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-config-data\") pod \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.329630 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cca3de-06a4-4adf-b400-fe78a34dbb65-logs\") pod \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\" (UID: \"a7cca3de-06a4-4adf-b400-fe78a34dbb65\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.330692 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7cca3de-06a4-4adf-b400-fe78a34dbb65-logs" (OuterVolumeSpecName: "logs") pod "a7cca3de-06a4-4adf-b400-fe78a34dbb65" (UID: "a7cca3de-06a4-4adf-b400-fe78a34dbb65"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.335017 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cca3de-06a4-4adf-b400-fe78a34dbb65-kube-api-access-xwkk5" (OuterVolumeSpecName: "kube-api-access-xwkk5") pod "a7cca3de-06a4-4adf-b400-fe78a34dbb65" (UID: "a7cca3de-06a4-4adf-b400-fe78a34dbb65"). InnerVolumeSpecName "kube-api-access-xwkk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.338730 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-scripts" (OuterVolumeSpecName: "scripts") pod "a7cca3de-06a4-4adf-b400-fe78a34dbb65" (UID: "a7cca3de-06a4-4adf-b400-fe78a34dbb65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.434505 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.434957 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cca3de-06a4-4adf-b400-fe78a34dbb65-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.434973 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwkk5\" (UniqueName: \"kubernetes.io/projected/a7cca3de-06a4-4adf-b400-fe78a34dbb65-kube-api-access-xwkk5\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.488351 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-config-data" (OuterVolumeSpecName: "config-data") pod "a7cca3de-06a4-4adf-b400-fe78a34dbb65" (UID: "a7cca3de-06a4-4adf-b400-fe78a34dbb65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.535159 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ff9c9c5c6-pkfqh"
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.535341 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7cca3de-06a4-4adf-b400-fe78a34dbb65" (UID: "a7cca3de-06a4-4adf-b400-fe78a34dbb65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.537456 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.537484 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.604535 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a7cca3de-06a4-4adf-b400-fe78a34dbb65" (UID: "a7cca3de-06a4-4adf-b400-fe78a34dbb65"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.611762 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a7cca3de-06a4-4adf-b400-fe78a34dbb65" (UID: "a7cca3de-06a4-4adf-b400-fe78a34dbb65"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.639016 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbxl8\" (UniqueName: \"kubernetes.io/projected/19e4074e-d9fa-436c-97d8-f9d90ea147e2-kube-api-access-vbxl8\") pod \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.639221 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-ovndb-tls-certs\") pod \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.639497 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-config\") pod \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.639559 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-httpd-config\") pod \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.639611 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-combined-ca-bundle\") pod \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\" (UID: \"19e4074e-d9fa-436c-97d8-f9d90ea147e2\") "
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.640321 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.640355 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cca3de-06a4-4adf-b400-fe78a34dbb65-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.647654 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "19e4074e-d9fa-436c-97d8-f9d90ea147e2" (UID: "19e4074e-d9fa-436c-97d8-f9d90ea147e2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.647814 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e4074e-d9fa-436c-97d8-f9d90ea147e2-kube-api-access-vbxl8" (OuterVolumeSpecName: "kube-api-access-vbxl8") pod "19e4074e-d9fa-436c-97d8-f9d90ea147e2" (UID: "19e4074e-d9fa-436c-97d8-f9d90ea147e2"). InnerVolumeSpecName "kube-api-access-vbxl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.710395 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19e4074e-d9fa-436c-97d8-f9d90ea147e2" (UID: "19e4074e-d9fa-436c-97d8-f9d90ea147e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.714244 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:29 crc kubenswrapper[5047]: W0223 07:08:29.717057 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd1c5cd9_e4e7_4d73_8ba5_06775687b89c.slice/crio-07f8704a404e28a38d52f536fd981261a2f37016da9c6fb0e2a28852d9173e77 WatchSource:0}: Error finding container 07f8704a404e28a38d52f536fd981261a2f37016da9c6fb0e2a28852d9173e77: Status 404 returned error can't find the container with id 07f8704a404e28a38d52f536fd981261a2f37016da9c6fb0e2a28852d9173e77
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.719343 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-config" (OuterVolumeSpecName: "config") pod "19e4074e-d9fa-436c-97d8-f9d90ea147e2" (UID: "19e4074e-d9fa-436c-97d8-f9d90ea147e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.746988 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.747042 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.747059 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbxl8\" (UniqueName: \"kubernetes.io/projected/19e4074e-d9fa-436c-97d8-f9d90ea147e2-kube-api-access-vbxl8\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.747074 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.763552 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "19e4074e-d9fa-436c-97d8-f9d90ea147e2" (UID: "19e4074e-d9fa-436c-97d8-f9d90ea147e2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:29 crc kubenswrapper[5047]: I0223 07:08:29.849108 5047 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e4074e-d9fa-436c-97d8-f9d90ea147e2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.159322 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c6b9994d4-nt845" event={"ID":"a7cca3de-06a4-4adf-b400-fe78a34dbb65","Type":"ContainerDied","Data":"09e358999d6346f0b6b1719bea325de7c97c4864c62d52af83d3d2ba1e7d56b1"}
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.159860 5047 scope.go:117] "RemoveContainer" containerID="4b60e5151d596147b7021c9545594a95c4b3f30414105fd3a7f9fe33908978eb"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.160115 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c6b9994d4-nt845"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.179142 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wlldh" event={"ID":"8a268e45-febe-4ae8-8c29-8f058d3b7e1d","Type":"ContainerStarted","Data":"85d35c7768b566f85eab0cf6425b4ba1d8ca97cbc5b6722b7a066d3865422d8e"}
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.184244 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ff9c9c5c6-pkfqh" event={"ID":"19e4074e-d9fa-436c-97d8-f9d90ea147e2","Type":"ContainerDied","Data":"c5e5f341371e2c3ad0918231991c95e427953084a3ddaf656a0b81f186095417"}
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.184365 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ff9c9c5c6-pkfqh"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.208013 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerStarted","Data":"07f8704a404e28a38d52f536fd981261a2f37016da9c6fb0e2a28852d9173e77"}
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.217569 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wlldh" podStartSLOduration=3.017454557 podStartE2EDuration="16.217554966s" podCreationTimestamp="2026-02-23 07:08:14 +0000 UTC" firstStartedPulling="2026-02-23 07:08:15.820026662 +0000 UTC m=+1418.071353786" lastFinishedPulling="2026-02-23 07:08:29.020127061 +0000 UTC m=+1431.271454195" observedRunningTime="2026-02-23 07:08:30.215505444 +0000 UTC m=+1432.466832608" watchObservedRunningTime="2026-02-23 07:08:30.217554966 +0000 UTC m=+1432.468882100"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.278148 5047 scope.go:117] "RemoveContainer" containerID="71648def4db91a786a875c748f6f4a8187a401446c788caf05a7425b249ebded"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.319484 5047 scope.go:117] "RemoveContainer" containerID="48beb39652f6bbd9ec63e7ff1fa9cf1cf2543500d09199781370b0ee22d9c880"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.363042 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c6b9994d4-nt845"]
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.363088 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c6b9994d4-nt845"]
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.377145 5047 scope.go:117] "RemoveContainer" containerID="39e5ccb29704604c78f137df87ae9930b8b1ecf21406a4701a0850a629ebcdfd"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.379255 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ff9c9c5c6-pkfqh"]
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.387640 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5ff9c9c5c6-pkfqh"]
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.796456 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.797161 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:08:30 crc kubenswrapper[5047]: I0223 07:08:30.805020 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 07:08:31 crc kubenswrapper[5047]: I0223 07:08:31.239275 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:31 crc kubenswrapper[5047]: I0223 07:08:31.249124 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerStarted","Data":"50fe878f105f28b7e44f72ee7869d984bdcae2673045f30f9158c67a8a0ca32f"}
Feb 23 07:08:31 crc kubenswrapper[5047]: I0223 07:08:31.249200 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerStarted","Data":"4ff9fae1ad5491a02509b3b94a974addb71a232b2e485a758c15bba57bc8d015"}
Feb 23 07:08:32 crc kubenswrapper[5047]: I0223 07:08:32.264538 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerStarted","Data":"527aa4d7904312806104cb0f238ef6c7ddf334b973afadf465f573497126282d"}
Feb 23 07:08:32 crc kubenswrapper[5047]: I0223 07:08:32.352028 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" path="/var/lib/kubelet/pods/19e4074e-d9fa-436c-97d8-f9d90ea147e2/volumes"
Feb 23 07:08:32 crc kubenswrapper[5047]: I0223 07:08:32.353250 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" path="/var/lib/kubelet/pods/a7cca3de-06a4-4adf-b400-fe78a34dbb65/volumes"
Feb 23 07:08:34 crc kubenswrapper[5047]: I0223 07:08:34.288661 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerStarted","Data":"31d37cdff43ca5ac3ce82bf97fd3e78ed32ca9df7ff1781016858cf5d2a6de76"}
Feb 23 07:08:34 crc kubenswrapper[5047]: I0223 07:08:34.289380 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-central-agent" containerID="cri-o://4ff9fae1ad5491a02509b3b94a974addb71a232b2e485a758c15bba57bc8d015" gracePeriod=30
Feb 23 07:08:34 crc kubenswrapper[5047]: I0223 07:08:34.289785 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5047]: I0223 07:08:34.290262 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="proxy-httpd" containerID="cri-o://31d37cdff43ca5ac3ce82bf97fd3e78ed32ca9df7ff1781016858cf5d2a6de76" gracePeriod=30
Feb 23 07:08:34 crc kubenswrapper[5047]: I0223 07:08:34.290343 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="sg-core" containerID="cri-o://527aa4d7904312806104cb0f238ef6c7ddf334b973afadf465f573497126282d" gracePeriod=30
Feb 23 07:08:34 crc kubenswrapper[5047]: I0223 07:08:34.290405 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-notification-agent" containerID="cri-o://50fe878f105f28b7e44f72ee7869d984bdcae2673045f30f9158c67a8a0ca32f" gracePeriod=30
Feb 23 07:08:34 crc kubenswrapper[5047]: I0223 07:08:34.328211 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.830034947 podStartE2EDuration="11.328176361s" podCreationTimestamp="2026-02-23 07:08:23 +0000 UTC" firstStartedPulling="2026-02-23 07:08:29.721297804 +0000 UTC m=+1431.972624938" lastFinishedPulling="2026-02-23 07:08:33.219439218 +0000 UTC m=+1435.470766352" observedRunningTime="2026-02-23 07:08:34.318313272 +0000 UTC m=+1436.569640426" watchObservedRunningTime="2026-02-23 07:08:34.328176361 +0000 UTC m=+1436.579503535"
Feb 23 07:08:35 crc kubenswrapper[5047]: I0223 07:08:35.308224 5047 generic.go:334] "Generic (PLEG): container finished" podID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerID="31d37cdff43ca5ac3ce82bf97fd3e78ed32ca9df7ff1781016858cf5d2a6de76" exitCode=0
Feb 23 07:08:35 crc kubenswrapper[5047]: I0223 07:08:35.308662 5047 generic.go:334] "Generic (PLEG): container finished" podID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerID="527aa4d7904312806104cb0f238ef6c7ddf334b973afadf465f573497126282d" exitCode=2
Feb 23 07:08:35 crc kubenswrapper[5047]: I0223 07:08:35.308677 5047 generic.go:334] "Generic (PLEG): container finished" podID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerID="50fe878f105f28b7e44f72ee7869d984bdcae2673045f30f9158c67a8a0ca32f" exitCode=0
Feb 23 07:08:35 crc kubenswrapper[5047]: I0223 07:08:35.308525 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerDied","Data":"31d37cdff43ca5ac3ce82bf97fd3e78ed32ca9df7ff1781016858cf5d2a6de76"}
Feb 23 07:08:35 crc kubenswrapper[5047]: I0223 07:08:35.308727 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerDied","Data":"527aa4d7904312806104cb0f238ef6c7ddf334b973afadf465f573497126282d"}
Feb 23 07:08:35 crc kubenswrapper[5047]: I0223 07:08:35.308746 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerDied","Data":"50fe878f105f28b7e44f72ee7869d984bdcae2673045f30f9158c67a8a0ca32f"}
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.392132 5047 generic.go:334] "Generic (PLEG): container finished" podID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerID="4ff9fae1ad5491a02509b3b94a974addb71a232b2e485a758c15bba57bc8d015" exitCode=0
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.392805 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerDied","Data":"4ff9fae1ad5491a02509b3b94a974addb71a232b2e485a758c15bba57bc8d015"}
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.748540 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.834697 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" (UID: "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.833989 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-run-httpd\") pod \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") "
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.835844 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-combined-ca-bundle\") pod \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") "
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.835950 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-sg-core-conf-yaml\") pod \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") "
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.836007 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct7c5\" (UniqueName: \"kubernetes.io/projected/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-kube-api-access-ct7c5\") pod \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") "
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.836035 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-scripts\") pod \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") "
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.836065 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-log-httpd\") pod \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") "
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.839043 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-config-data\") pod \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\" (UID: \"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c\") "
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.839869 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" (UID: "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.841161 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.841304 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.860024 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-kube-api-access-ct7c5" (OuterVolumeSpecName: "kube-api-access-ct7c5") pod "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" (UID: "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c"). InnerVolumeSpecName "kube-api-access-ct7c5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.861393 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-scripts" (OuterVolumeSpecName: "scripts") pod "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" (UID: "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.868311 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" (UID: "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.946063 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.946114 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct7c5\" (UniqueName: \"kubernetes.io/projected/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-kube-api-access-ct7c5\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.946129 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.950718 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" (UID: "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:40 crc kubenswrapper[5047]: I0223 07:08:40.972392 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-config-data" (OuterVolumeSpecName: "config-data") pod "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" (UID: "fd1c5cd9-e4e7-4d73-8ba5-06775687b89c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.047653 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.047686 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.407107 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd1c5cd9-e4e7-4d73-8ba5-06775687b89c","Type":"ContainerDied","Data":"07f8704a404e28a38d52f536fd981261a2f37016da9c6fb0e2a28852d9173e77"} Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.408998 5047 scope.go:117] "RemoveContainer" containerID="31d37cdff43ca5ac3ce82bf97fd3e78ed32ca9df7ff1781016858cf5d2a6de76" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.407245 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.438637 5047 scope.go:117] "RemoveContainer" containerID="527aa4d7904312806104cb0f238ef6c7ddf334b973afadf465f573497126282d" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.461566 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.485481 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.502484 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507151 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-api" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507189 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-api" Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507232 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="sg-core" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507240 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="sg-core" Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507258 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-api" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507264 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-api" Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507276 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-notification-agent" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507283 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-notification-agent" Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507305 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-log" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507311 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-log" Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507324 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-httpd" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507332 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-httpd" Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507341 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-central-agent" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507348 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-central-agent" Feb 23 07:08:41 crc kubenswrapper[5047]: E0223 07:08:41.507357 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="proxy-httpd" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507363 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="proxy-httpd" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507647 5047 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="proxy-httpd" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507659 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-api" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507670 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-api" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507683 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e4074e-d9fa-436c-97d8-f9d90ea147e2" containerName="neutron-httpd" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507693 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cca3de-06a4-4adf-b400-fe78a34dbb65" containerName="placement-log" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507705 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-central-agent" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507716 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="ceilometer-notification-agent" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.507726 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" containerName="sg-core" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.513333 5047 scope.go:117] "RemoveContainer" containerID="50fe878f105f28b7e44f72ee7869d984bdcae2673045f30f9158c67a8a0ca32f" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.524738 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.524884 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.527378 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.528452 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.569882 5047 scope.go:117] "RemoveContainer" containerID="4ff9fae1ad5491a02509b3b94a974addb71a232b2e485a758c15bba57bc8d015" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.661657 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-scripts\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.661728 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-run-httpd\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.661760 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-log-httpd\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.661781 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-config-data\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " 
pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.661826 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2qpp\" (UniqueName: \"kubernetes.io/projected/145fe444-5cf8-408b-8e73-43cf4612d8fb-kube-api-access-m2qpp\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.661876 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.661925 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.764209 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-scripts\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.764320 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-run-httpd\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.764359 5047 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-log-httpd\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.764386 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-config-data\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.764446 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qpp\" (UniqueName: \"kubernetes.io/projected/145fe444-5cf8-408b-8e73-43cf4612d8fb-kube-api-access-m2qpp\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.764504 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.764549 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.766198 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-run-httpd\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 
23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.766498 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-log-httpd\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.773681 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.773891 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.774127 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-config-data\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.794245 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-scripts\") pod \"ceilometer-0\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.798544 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qpp\" (UniqueName: \"kubernetes.io/projected/145fe444-5cf8-408b-8e73-43cf4612d8fb-kube-api-access-m2qpp\") pod \"ceilometer-0\" 
(UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " pod="openstack/ceilometer-0" Feb 23 07:08:41 crc kubenswrapper[5047]: I0223 07:08:41.865577 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:42 crc kubenswrapper[5047]: I0223 07:08:42.351236 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1c5cd9-e4e7-4d73-8ba5-06775687b89c" path="/var/lib/kubelet/pods/fd1c5cd9-e4e7-4d73-8ba5-06775687b89c/volumes" Feb 23 07:08:42 crc kubenswrapper[5047]: I0223 07:08:42.428686 5047 generic.go:334] "Generic (PLEG): container finished" podID="8a268e45-febe-4ae8-8c29-8f058d3b7e1d" containerID="85d35c7768b566f85eab0cf6425b4ba1d8ca97cbc5b6722b7a066d3865422d8e" exitCode=0 Feb 23 07:08:42 crc kubenswrapper[5047]: I0223 07:08:42.428743 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wlldh" event={"ID":"8a268e45-febe-4ae8-8c29-8f058d3b7e1d","Type":"ContainerDied","Data":"85d35c7768b566f85eab0cf6425b4ba1d8ca97cbc5b6722b7a066d3865422d8e"} Feb 23 07:08:42 crc kubenswrapper[5047]: I0223 07:08:42.430873 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:42 crc kubenswrapper[5047]: W0223 07:08:42.432884 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod145fe444_5cf8_408b_8e73_43cf4612d8fb.slice/crio-27bad713b4d8cf9d6d2487bf9c998d6f38a0fc0d6ab3b7e16694931a2076d3d1 WatchSource:0}: Error finding container 27bad713b4d8cf9d6d2487bf9c998d6f38a0fc0d6ab3b7e16694931a2076d3d1: Status 404 returned error can't find the container with id 27bad713b4d8cf9d6d2487bf9c998d6f38a0fc0d6ab3b7e16694931a2076d3d1 Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.440752 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerStarted","Data":"5751ee81a594e2aa3943d910dce95002af9da3e9c256b1464fc01eeda3cc9699"} Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.441414 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerStarted","Data":"27bad713b4d8cf9d6d2487bf9c998d6f38a0fc0d6ab3b7e16694931a2076d3d1"} Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.837006 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.920819 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-combined-ca-bundle\") pod \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.920936 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-scripts\") pod \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.920968 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvxfg\" (UniqueName: \"kubernetes.io/projected/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-kube-api-access-vvxfg\") pod \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.921021 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-config-data\") pod 
\"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\" (UID: \"8a268e45-febe-4ae8-8c29-8f058d3b7e1d\") " Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.927143 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-kube-api-access-vvxfg" (OuterVolumeSpecName: "kube-api-access-vvxfg") pod "8a268e45-febe-4ae8-8c29-8f058d3b7e1d" (UID: "8a268e45-febe-4ae8-8c29-8f058d3b7e1d"). InnerVolumeSpecName "kube-api-access-vvxfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.931243 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-scripts" (OuterVolumeSpecName: "scripts") pod "8a268e45-febe-4ae8-8c29-8f058d3b7e1d" (UID: "8a268e45-febe-4ae8-8c29-8f058d3b7e1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.950265 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a268e45-febe-4ae8-8c29-8f058d3b7e1d" (UID: "8a268e45-febe-4ae8-8c29-8f058d3b7e1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:43 crc kubenswrapper[5047]: I0223 07:08:43.977206 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-config-data" (OuterVolumeSpecName: "config-data") pod "8a268e45-febe-4ae8-8c29-8f058d3b7e1d" (UID: "8a268e45-febe-4ae8-8c29-8f058d3b7e1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.024431 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.024724 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.024787 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvxfg\" (UniqueName: \"kubernetes.io/projected/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-kube-api-access-vvxfg\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.024842 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268e45-febe-4ae8-8c29-8f058d3b7e1d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.455252 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wlldh" event={"ID":"8a268e45-febe-4ae8-8c29-8f058d3b7e1d","Type":"ContainerDied","Data":"e24da6e33daca5cc7a90ce5df0757966a1a477ed7b1390e86ff6aa3cb330c7f1"} Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.455767 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e24da6e33daca5cc7a90ce5df0757966a1a477ed7b1390e86ff6aa3cb330c7f1" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.455865 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wlldh" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.465222 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerStarted","Data":"ad84c6ec5eabcb07103d6ec8277b5a04a1db646f7ae2ba8d38afb8609d719c5d"} Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.594776 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:08:44 crc kubenswrapper[5047]: E0223 07:08:44.595466 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a268e45-febe-4ae8-8c29-8f058d3b7e1d" containerName="nova-cell0-conductor-db-sync" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.595492 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a268e45-febe-4ae8-8c29-8f058d3b7e1d" containerName="nova-cell0-conductor-db-sync" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.595746 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a268e45-febe-4ae8-8c29-8f058d3b7e1d" containerName="nova-cell0-conductor-db-sync" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.596826 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.599829 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-spntp" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.600252 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.625778 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.737774 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.738329 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.738456 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6fj\" (UniqueName: \"kubernetes.io/projected/287ef842-0502-41ff-ae1c-ea849066edca-kube-api-access-5m6fj\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.841051 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.841113 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6fj\" (UniqueName: \"kubernetes.io/projected/287ef842-0502-41ff-ae1c-ea849066edca-kube-api-access-5m6fj\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.841170 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.847890 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.848803 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.863285 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6fj\" (UniqueName: \"kubernetes.io/projected/287ef842-0502-41ff-ae1c-ea849066edca-kube-api-access-5m6fj\") pod \"nova-cell0-conductor-0\" 
(UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:44 crc kubenswrapper[5047]: I0223 07:08:44.997807 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:45 crc kubenswrapper[5047]: I0223 07:08:45.430125 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:08:45 crc kubenswrapper[5047]: I0223 07:08:45.482448 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerStarted","Data":"7460e60a106f9e4539d487e9f27d99fcb6446fb9ff6a9be35f6a77114ae65731"} Feb 23 07:08:45 crc kubenswrapper[5047]: W0223 07:08:45.485431 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod287ef842_0502_41ff_ae1c_ea849066edca.slice/crio-15c8e0d0f5cc82ed9d7ff73714695f02407212d8efe14c38c2c93f53e6f6368a WatchSource:0}: Error finding container 15c8e0d0f5cc82ed9d7ff73714695f02407212d8efe14c38c2c93f53e6f6368a: Status 404 returned error can't find the container with id 15c8e0d0f5cc82ed9d7ff73714695f02407212d8efe14c38c2c93f53e6f6368a Feb 23 07:08:45 crc kubenswrapper[5047]: I0223 07:08:45.493638 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.303626 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.495007 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"287ef842-0502-41ff-ae1c-ea849066edca","Type":"ContainerStarted","Data":"0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818"} Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.495232 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"287ef842-0502-41ff-ae1c-ea849066edca","Type":"ContainerStarted","Data":"15c8e0d0f5cc82ed9d7ff73714695f02407212d8efe14c38c2c93f53e6f6368a"} Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.495522 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" gracePeriod=30 Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.495593 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.507644 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerStarted","Data":"4daed24c62831d00d640ce79e854033621425fe564637df4fe5f19dac6337961"} Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.508833 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.522354 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.522329205 podStartE2EDuration="2.522329205s" podCreationTimestamp="2026-02-23 07:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:46.518367725 +0000 UTC m=+1448.769694859" watchObservedRunningTime="2026-02-23 07:08:46.522329205 +0000 UTC m=+1448.773656339" Feb 23 07:08:46 crc kubenswrapper[5047]: I0223 07:08:46.556799 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.299645892 podStartE2EDuration="5.556777505s" 
podCreationTimestamp="2026-02-23 07:08:41 +0000 UTC" firstStartedPulling="2026-02-23 07:08:42.435641335 +0000 UTC m=+1444.686968469" lastFinishedPulling="2026-02-23 07:08:45.692772948 +0000 UTC m=+1447.944100082" observedRunningTime="2026-02-23 07:08:46.548571398 +0000 UTC m=+1448.799898532" watchObservedRunningTime="2026-02-23 07:08:46.556777505 +0000 UTC m=+1448.808104639" Feb 23 07:08:47 crc kubenswrapper[5047]: I0223 07:08:47.517716 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-central-agent" containerID="cri-o://5751ee81a594e2aa3943d910dce95002af9da3e9c256b1464fc01eeda3cc9699" gracePeriod=30 Feb 23 07:08:47 crc kubenswrapper[5047]: I0223 07:08:47.517839 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-notification-agent" containerID="cri-o://ad84c6ec5eabcb07103d6ec8277b5a04a1db646f7ae2ba8d38afb8609d719c5d" gracePeriod=30 Feb 23 07:08:47 crc kubenswrapper[5047]: I0223 07:08:47.517870 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="proxy-httpd" containerID="cri-o://4daed24c62831d00d640ce79e854033621425fe564637df4fe5f19dac6337961" gracePeriod=30 Feb 23 07:08:47 crc kubenswrapper[5047]: I0223 07:08:47.520416 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="sg-core" containerID="cri-o://7460e60a106f9e4539d487e9f27d99fcb6446fb9ff6a9be35f6a77114ae65731" gracePeriod=30 Feb 23 07:08:48 crc kubenswrapper[5047]: I0223 07:08:48.535634 5047 generic.go:334] "Generic (PLEG): container finished" podID="145fe444-5cf8-408b-8e73-43cf4612d8fb" 
containerID="4daed24c62831d00d640ce79e854033621425fe564637df4fe5f19dac6337961" exitCode=0 Feb 23 07:08:48 crc kubenswrapper[5047]: I0223 07:08:48.536045 5047 generic.go:334] "Generic (PLEG): container finished" podID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerID="7460e60a106f9e4539d487e9f27d99fcb6446fb9ff6a9be35f6a77114ae65731" exitCode=2 Feb 23 07:08:48 crc kubenswrapper[5047]: I0223 07:08:48.536059 5047 generic.go:334] "Generic (PLEG): container finished" podID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerID="ad84c6ec5eabcb07103d6ec8277b5a04a1db646f7ae2ba8d38afb8609d719c5d" exitCode=0 Feb 23 07:08:48 crc kubenswrapper[5047]: I0223 07:08:48.535828 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerDied","Data":"4daed24c62831d00d640ce79e854033621425fe564637df4fe5f19dac6337961"} Feb 23 07:08:48 crc kubenswrapper[5047]: I0223 07:08:48.536110 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerDied","Data":"7460e60a106f9e4539d487e9f27d99fcb6446fb9ff6a9be35f6a77114ae65731"} Feb 23 07:08:48 crc kubenswrapper[5047]: I0223 07:08:48.536128 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerDied","Data":"ad84c6ec5eabcb07103d6ec8277b5a04a1db646f7ae2ba8d38afb8609d719c5d"} Feb 23 07:08:50 crc kubenswrapper[5047]: E0223 07:08:50.021411 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:08:50 crc kubenswrapper[5047]: E0223 07:08:50.023705 5047 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:08:50 crc kubenswrapper[5047]: E0223 07:08:50.025045 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:08:50 crc kubenswrapper[5047]: E0223 07:08:50.025079 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.579051 5047 generic.go:334] "Generic (PLEG): container finished" podID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerID="5751ee81a594e2aa3943d910dce95002af9da3e9c256b1464fc01eeda3cc9699" exitCode=0 Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.579135 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerDied","Data":"5751ee81a594e2aa3943d910dce95002af9da3e9c256b1464fc01eeda3cc9699"} Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.706162 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.828306 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2qpp\" (UniqueName: \"kubernetes.io/projected/145fe444-5cf8-408b-8e73-43cf4612d8fb-kube-api-access-m2qpp\") pod \"145fe444-5cf8-408b-8e73-43cf4612d8fb\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.828401 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-combined-ca-bundle\") pod \"145fe444-5cf8-408b-8e73-43cf4612d8fb\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.828513 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-scripts\") pod \"145fe444-5cf8-408b-8e73-43cf4612d8fb\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.828631 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-run-httpd\") pod \"145fe444-5cf8-408b-8e73-43cf4612d8fb\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.828717 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-sg-core-conf-yaml\") pod \"145fe444-5cf8-408b-8e73-43cf4612d8fb\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.828806 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-config-data\") pod \"145fe444-5cf8-408b-8e73-43cf4612d8fb\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.828843 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-log-httpd\") pod \"145fe444-5cf8-408b-8e73-43cf4612d8fb\" (UID: \"145fe444-5cf8-408b-8e73-43cf4612d8fb\") " Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.829738 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "145fe444-5cf8-408b-8e73-43cf4612d8fb" (UID: "145fe444-5cf8-408b-8e73-43cf4612d8fb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.829842 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "145fe444-5cf8-408b-8e73-43cf4612d8fb" (UID: "145fe444-5cf8-408b-8e73-43cf4612d8fb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.836056 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145fe444-5cf8-408b-8e73-43cf4612d8fb-kube-api-access-m2qpp" (OuterVolumeSpecName: "kube-api-access-m2qpp") pod "145fe444-5cf8-408b-8e73-43cf4612d8fb" (UID: "145fe444-5cf8-408b-8e73-43cf4612d8fb"). InnerVolumeSpecName "kube-api-access-m2qpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.840039 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-scripts" (OuterVolumeSpecName: "scripts") pod "145fe444-5cf8-408b-8e73-43cf4612d8fb" (UID: "145fe444-5cf8-408b-8e73-43cf4612d8fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.858522 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "145fe444-5cf8-408b-8e73-43cf4612d8fb" (UID: "145fe444-5cf8-408b-8e73-43cf4612d8fb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.910776 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "145fe444-5cf8-408b-8e73-43cf4612d8fb" (UID: "145fe444-5cf8-408b-8e73-43cf4612d8fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.931687 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.931741 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.931758 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2qpp\" (UniqueName: \"kubernetes.io/projected/145fe444-5cf8-408b-8e73-43cf4612d8fb-kube-api-access-m2qpp\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.931773 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.931784 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.931795 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/145fe444-5cf8-408b-8e73-43cf4612d8fb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:52 crc kubenswrapper[5047]: I0223 07:08:52.932887 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-config-data" (OuterVolumeSpecName: "config-data") pod "145fe444-5cf8-408b-8e73-43cf4612d8fb" (UID: "145fe444-5cf8-408b-8e73-43cf4612d8fb"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.033993 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145fe444-5cf8-408b-8e73-43cf4612d8fb-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.591680 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"145fe444-5cf8-408b-8e73-43cf4612d8fb","Type":"ContainerDied","Data":"27bad713b4d8cf9d6d2487bf9c998d6f38a0fc0d6ab3b7e16694931a2076d3d1"} Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.591742 5047 scope.go:117] "RemoveContainer" containerID="4daed24c62831d00d640ce79e854033621425fe564637df4fe5f19dac6337961" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.591891 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.632194 5047 scope.go:117] "RemoveContainer" containerID="7460e60a106f9e4539d487e9f27d99fcb6446fb9ff6a9be35f6a77114ae65731" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.670966 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.676183 5047 scope.go:117] "RemoveContainer" containerID="ad84c6ec5eabcb07103d6ec8277b5a04a1db646f7ae2ba8d38afb8609d719c5d" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.697003 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.713058 5047 scope.go:117] "RemoveContainer" containerID="5751ee81a594e2aa3943d910dce95002af9da3e9c256b1464fc01eeda3cc9699" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.718240 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:53 crc 
kubenswrapper[5047]: E0223 07:08:53.718693 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-notification-agent" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.718712 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-notification-agent" Feb 23 07:08:53 crc kubenswrapper[5047]: E0223 07:08:53.718740 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="sg-core" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.718751 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="sg-core" Feb 23 07:08:53 crc kubenswrapper[5047]: E0223 07:08:53.718763 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-central-agent" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.718769 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-central-agent" Feb 23 07:08:53 crc kubenswrapper[5047]: E0223 07:08:53.718793 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="proxy-httpd" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.718801 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="proxy-httpd" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.719007 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="sg-core" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.719023 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="proxy-httpd" Feb 23 07:08:53 
crc kubenswrapper[5047]: I0223 07:08:53.719034 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-notification-agent" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.719050 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" containerName="ceilometer-central-agent" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.720675 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.726255 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.726510 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.817970 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.860158 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-scripts\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.860308 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7khkz\" (UniqueName: \"kubernetes.io/projected/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-kube-api-access-7khkz\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.860372 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-log-httpd\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.860672 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-run-httpd\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.860892 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.860993 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-config-data\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.861251 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.963432 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-run-httpd\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " 
pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.963885 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.964011 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-config-data\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.964132 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.964253 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-scripts\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.964395 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-run-httpd\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.964889 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7khkz\" (UniqueName: 
\"kubernetes.io/projected/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-kube-api-access-7khkz\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.965039 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-log-httpd\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.965426 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-log-httpd\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.969576 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.969670 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-scripts\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.970604 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-config-data\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.971639 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5047]: I0223 07:08:53.986787 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7khkz\" (UniqueName: \"kubernetes.io/projected/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-kube-api-access-7khkz\") pod \"ceilometer-0\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") " pod="openstack/ceilometer-0" Feb 23 07:08:54 crc kubenswrapper[5047]: I0223 07:08:54.104194 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:54 crc kubenswrapper[5047]: I0223 07:08:54.355783 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145fe444-5cf8-408b-8e73-43cf4612d8fb" path="/var/lib/kubelet/pods/145fe444-5cf8-408b-8e73-43cf4612d8fb/volumes" Feb 23 07:08:54 crc kubenswrapper[5047]: I0223 07:08:54.637973 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:55 crc kubenswrapper[5047]: E0223 07:08:55.001777 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:08:55 crc kubenswrapper[5047]: E0223 07:08:55.004159 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:08:55 crc 
kubenswrapper[5047]: E0223 07:08:55.006274 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:08:55 crc kubenswrapper[5047]: E0223 07:08:55.006347 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:08:55 crc kubenswrapper[5047]: I0223 07:08:55.624225 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerStarted","Data":"e0180980acad646ff81dc12d1d2091d5dcafd78c9e1c8d1583d842b36e52d4ce"} Feb 23 07:08:55 crc kubenswrapper[5047]: I0223 07:08:55.624964 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerStarted","Data":"174fc1d9065bc442bb17d94cf085f30b567e60778ccf2edacbef965785977f7d"} Feb 23 07:08:56 crc kubenswrapper[5047]: I0223 07:08:56.635764 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerStarted","Data":"8aec5caea1192302b7b1be1c7ec79843c909cd94e3ed9794a7052b7df22f1b8d"} Feb 23 07:08:57 crc kubenswrapper[5047]: I0223 07:08:57.649503 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerStarted","Data":"e8c19b482335e07fe147b02aa7397108698ecbcd460d651e57c356e9bb8d81d2"} Feb 23 07:08:58 crc 
kubenswrapper[5047]: I0223 07:08:58.648430 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q6n2t"] Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.653802 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.663791 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q6n2t"] Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.668949 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerStarted","Data":"d9df5839961aafe72950647a46bfa6c0f2fe6e62de134192322d86962cb62235"} Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.669436 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.741829 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.684961907 podStartE2EDuration="5.741805809s" podCreationTimestamp="2026-02-23 07:08:53 +0000 UTC" firstStartedPulling="2026-02-23 07:08:54.661777006 +0000 UTC m=+1456.913104180" lastFinishedPulling="2026-02-23 07:08:57.718620938 +0000 UTC m=+1459.969948082" observedRunningTime="2026-02-23 07:08:58.736618577 +0000 UTC m=+1460.987945721" watchObservedRunningTime="2026-02-23 07:08:58.741805809 +0000 UTC m=+1460.993132953" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.792746 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-catalog-content\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " 
pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.792872 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgggg\" (UniqueName: \"kubernetes.io/projected/7a17ee75-a91f-4d3c-b53c-57c52bb86743-kube-api-access-fgggg\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.793032 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-utilities\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.895008 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-utilities\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.895102 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-catalog-content\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.895163 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgggg\" (UniqueName: \"kubernetes.io/projected/7a17ee75-a91f-4d3c-b53c-57c52bb86743-kube-api-access-fgggg\") pod \"certified-operators-q6n2t\" (UID: 
\"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.895616 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-utilities\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.895712 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-catalog-content\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.925164 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgggg\" (UniqueName: \"kubernetes.io/projected/7a17ee75-a91f-4d3c-b53c-57c52bb86743-kube-api-access-fgggg\") pod \"certified-operators-q6n2t\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:58 crc kubenswrapper[5047]: I0223 07:08:58.988713 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:08:59 crc kubenswrapper[5047]: W0223 07:08:59.555966 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a17ee75_a91f_4d3c_b53c_57c52bb86743.slice/crio-f433295933b26bc728c488dc797f94f065fa6af601110a6b112666ea7b5106db WatchSource:0}: Error finding container f433295933b26bc728c488dc797f94f065fa6af601110a6b112666ea7b5106db: Status 404 returned error can't find the container with id f433295933b26bc728c488dc797f94f065fa6af601110a6b112666ea7b5106db Feb 23 07:08:59 crc kubenswrapper[5047]: I0223 07:08:59.563200 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q6n2t"] Feb 23 07:08:59 crc kubenswrapper[5047]: I0223 07:08:59.679285 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6n2t" event={"ID":"7a17ee75-a91f-4d3c-b53c-57c52bb86743","Type":"ContainerStarted","Data":"f433295933b26bc728c488dc797f94f065fa6af601110a6b112666ea7b5106db"} Feb 23 07:09:00 crc kubenswrapper[5047]: E0223 07:09:00.000543 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:00 crc kubenswrapper[5047]: E0223 07:09:00.007243 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:00 crc kubenswrapper[5047]: E0223 07:09:00.009384 5047 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:00 crc kubenswrapper[5047]: E0223 07:09:00.009454 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:09:00 crc kubenswrapper[5047]: I0223 07:09:00.691686 5047 generic.go:334] "Generic (PLEG): container finished" podID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerID="c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195" exitCode=0 Feb 23 07:09:00 crc kubenswrapper[5047]: I0223 07:09:00.691749 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6n2t" event={"ID":"7a17ee75-a91f-4d3c-b53c-57c52bb86743","Type":"ContainerDied","Data":"c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195"} Feb 23 07:09:01 crc kubenswrapper[5047]: I0223 07:09:01.706108 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6n2t" event={"ID":"7a17ee75-a91f-4d3c-b53c-57c52bb86743","Type":"ContainerStarted","Data":"eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f"} Feb 23 07:09:02 crc kubenswrapper[5047]: I0223 07:09:02.718773 5047 generic.go:334] "Generic (PLEG): container finished" podID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerID="eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f" exitCode=0 Feb 23 07:09:02 crc kubenswrapper[5047]: I0223 07:09:02.718844 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-q6n2t" event={"ID":"7a17ee75-a91f-4d3c-b53c-57c52bb86743","Type":"ContainerDied","Data":"eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f"} Feb 23 07:09:04 crc kubenswrapper[5047]: I0223 07:09:04.746406 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6n2t" event={"ID":"7a17ee75-a91f-4d3c-b53c-57c52bb86743","Type":"ContainerStarted","Data":"8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1"} Feb 23 07:09:04 crc kubenswrapper[5047]: I0223 07:09:04.771574 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q6n2t" podStartSLOduration=3.639211079 podStartE2EDuration="6.771548408s" podCreationTimestamp="2026-02-23 07:08:58 +0000 UTC" firstStartedPulling="2026-02-23 07:09:00.695097317 +0000 UTC m=+1462.946424451" lastFinishedPulling="2026-02-23 07:09:03.827434626 +0000 UTC m=+1466.078761780" observedRunningTime="2026-02-23 07:09:04.767888516 +0000 UTC m=+1467.019215690" watchObservedRunningTime="2026-02-23 07:09:04.771548408 +0000 UTC m=+1467.022875552" Feb 23 07:09:05 crc kubenswrapper[5047]: E0223 07:09:05.000461 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:05 crc kubenswrapper[5047]: E0223 07:09:05.002139 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:05 crc kubenswrapper[5047]: E0223 
07:09:05.003421 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:05 crc kubenswrapper[5047]: E0223 07:09:05.003541 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:09:08 crc kubenswrapper[5047]: I0223 07:09:08.988995 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:09:08 crc kubenswrapper[5047]: I0223 07:09:08.989987 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:09:09 crc kubenswrapper[5047]: I0223 07:09:09.081243 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:09:09 crc kubenswrapper[5047]: I0223 07:09:09.861018 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:09:10 crc kubenswrapper[5047]: E0223 07:09:10.000716 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:10 crc kubenswrapper[5047]: E0223 07:09:10.003317 5047 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:10 crc kubenswrapper[5047]: E0223 07:09:10.007410 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:10 crc kubenswrapper[5047]: E0223 07:09:10.007525 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:09:11 crc kubenswrapper[5047]: I0223 07:09:11.233168 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q6n2t"] Feb 23 07:09:11 crc kubenswrapper[5047]: I0223 07:09:11.842882 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q6n2t" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="registry-server" containerID="cri-o://8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1" gracePeriod=2 Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.418229 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.597190 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-catalog-content\") pod \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.597831 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgggg\" (UniqueName: \"kubernetes.io/projected/7a17ee75-a91f-4d3c-b53c-57c52bb86743-kube-api-access-fgggg\") pod \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.597983 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-utilities\") pod \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\" (UID: \"7a17ee75-a91f-4d3c-b53c-57c52bb86743\") " Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.600463 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-utilities" (OuterVolumeSpecName: "utilities") pod "7a17ee75-a91f-4d3c-b53c-57c52bb86743" (UID: "7a17ee75-a91f-4d3c-b53c-57c52bb86743"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.608860 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a17ee75-a91f-4d3c-b53c-57c52bb86743-kube-api-access-fgggg" (OuterVolumeSpecName: "kube-api-access-fgggg") pod "7a17ee75-a91f-4d3c-b53c-57c52bb86743" (UID: "7a17ee75-a91f-4d3c-b53c-57c52bb86743"). InnerVolumeSpecName "kube-api-access-fgggg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.667149 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a17ee75-a91f-4d3c-b53c-57c52bb86743" (UID: "7a17ee75-a91f-4d3c-b53c-57c52bb86743"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.702000 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.702051 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a17ee75-a91f-4d3c-b53c-57c52bb86743-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.702072 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgggg\" (UniqueName: \"kubernetes.io/projected/7a17ee75-a91f-4d3c-b53c-57c52bb86743-kube-api-access-fgggg\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.859764 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q6n2t" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.859706 5047 generic.go:334] "Generic (PLEG): container finished" podID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerID="8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1" exitCode=0 Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.859868 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6n2t" event={"ID":"7a17ee75-a91f-4d3c-b53c-57c52bb86743","Type":"ContainerDied","Data":"8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1"} Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.860105 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q6n2t" event={"ID":"7a17ee75-a91f-4d3c-b53c-57c52bb86743","Type":"ContainerDied","Data":"f433295933b26bc728c488dc797f94f065fa6af601110a6b112666ea7b5106db"} Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.860153 5047 scope.go:117] "RemoveContainer" containerID="8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.899918 5047 scope.go:117] "RemoveContainer" containerID="eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.913540 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q6n2t"] Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.928086 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q6n2t"] Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.929893 5047 scope.go:117] "RemoveContainer" containerID="c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.997495 5047 scope.go:117] "RemoveContainer" 
containerID="8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1" Feb 23 07:09:12 crc kubenswrapper[5047]: E0223 07:09:12.998187 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1\": container with ID starting with 8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1 not found: ID does not exist" containerID="8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.998248 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1"} err="failed to get container status \"8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1\": rpc error: code = NotFound desc = could not find container \"8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1\": container with ID starting with 8b3a4ddb23cf9bc0732a24845f4e60187041b114ee3fe12b467933cd2cdb78c1 not found: ID does not exist" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.998286 5047 scope.go:117] "RemoveContainer" containerID="eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f" Feb 23 07:09:12 crc kubenswrapper[5047]: E0223 07:09:12.999032 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f\": container with ID starting with eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f not found: ID does not exist" containerID="eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.999101 5047 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f"} err="failed to get container status \"eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f\": rpc error: code = NotFound desc = could not find container \"eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f\": container with ID starting with eae27f1bf42848cf6bb8ab7d87102809e3efcf1359792a61cc4b0818df03ca6f not found: ID does not exist" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.999142 5047 scope.go:117] "RemoveContainer" containerID="c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195" Feb 23 07:09:12 crc kubenswrapper[5047]: E0223 07:09:12.999650 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195\": container with ID starting with c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195 not found: ID does not exist" containerID="c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195" Feb 23 07:09:12 crc kubenswrapper[5047]: I0223 07:09:12.999858 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195"} err="failed to get container status \"c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195\": rpc error: code = NotFound desc = could not find container \"c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195\": container with ID starting with c36f2bc13b2222876922b4baa8e613192d719f46dc525eb16a3c76f1cf5cd195 not found: ID does not exist" Feb 23 07:09:14 crc kubenswrapper[5047]: I0223 07:09:14.357193 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" path="/var/lib/kubelet/pods/7a17ee75-a91f-4d3c-b53c-57c52bb86743/volumes" Feb 23 07:09:15 crc kubenswrapper[5047]: E0223 
07:09:15.001621 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:15 crc kubenswrapper[5047]: E0223 07:09:15.004637 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:15 crc kubenswrapper[5047]: E0223 07:09:15.007499 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:09:15 crc kubenswrapper[5047]: E0223 07:09:15.007575 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:09:16 crc kubenswrapper[5047]: I0223 07:09:16.760349 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:09:16 crc kubenswrapper[5047]: I0223 07:09:16.760864 5047 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:09:16 crc kubenswrapper[5047]: I0223 07:09:16.910597 5047 generic.go:334] "Generic (PLEG): container finished" podID="287ef842-0502-41ff-ae1c-ea849066edca" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" exitCode=137 Feb 23 07:09:16 crc kubenswrapper[5047]: I0223 07:09:16.910649 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"287ef842-0502-41ff-ae1c-ea849066edca","Type":"ContainerDied","Data":"0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818"} Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.017060 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.122205 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m6fj\" (UniqueName: \"kubernetes.io/projected/287ef842-0502-41ff-ae1c-ea849066edca-kube-api-access-5m6fj\") pod \"287ef842-0502-41ff-ae1c-ea849066edca\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.122319 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-combined-ca-bundle\") pod \"287ef842-0502-41ff-ae1c-ea849066edca\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.122561 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-config-data\") 
pod \"287ef842-0502-41ff-ae1c-ea849066edca\" (UID: \"287ef842-0502-41ff-ae1c-ea849066edca\") " Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.131295 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287ef842-0502-41ff-ae1c-ea849066edca-kube-api-access-5m6fj" (OuterVolumeSpecName: "kube-api-access-5m6fj") pod "287ef842-0502-41ff-ae1c-ea849066edca" (UID: "287ef842-0502-41ff-ae1c-ea849066edca"). InnerVolumeSpecName "kube-api-access-5m6fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.158316 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-config-data" (OuterVolumeSpecName: "config-data") pod "287ef842-0502-41ff-ae1c-ea849066edca" (UID: "287ef842-0502-41ff-ae1c-ea849066edca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.161583 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "287ef842-0502-41ff-ae1c-ea849066edca" (UID: "287ef842-0502-41ff-ae1c-ea849066edca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.226530 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m6fj\" (UniqueName: \"kubernetes.io/projected/287ef842-0502-41ff-ae1c-ea849066edca-kube-api-access-5m6fj\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.226620 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.226647 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/287ef842-0502-41ff-ae1c-ea849066edca-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.923197 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"287ef842-0502-41ff-ae1c-ea849066edca","Type":"ContainerDied","Data":"15c8e0d0f5cc82ed9d7ff73714695f02407212d8efe14c38c2c93f53e6f6368a"} Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.923266 5047 scope.go:117] "RemoveContainer" containerID="0c94af0e4db89f5918f117bf89c12af1436794688044554bf8fdb4dcc9e5f818" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.923320 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.975334 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:09:17 crc kubenswrapper[5047]: I0223 07:09:17.984009 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.009793 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:09:18 crc kubenswrapper[5047]: E0223 07:09:18.010261 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="extract-content" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.010281 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="extract-content" Feb 23 07:09:18 crc kubenswrapper[5047]: E0223 07:09:18.010296 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.010303 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:09:18 crc kubenswrapper[5047]: E0223 07:09:18.010321 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="extract-utilities" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.010328 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="extract-utilities" Feb 23 07:09:18 crc kubenswrapper[5047]: E0223 07:09:18.010348 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="registry-server" Feb 23 07:09:18 crc 
kubenswrapper[5047]: I0223 07:09:18.010355 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="registry-server" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.010522 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="287ef842-0502-41ff-ae1c-ea849066edca" containerName="nova-cell0-conductor-conductor" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.010551 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a17ee75-a91f-4d3c-b53c-57c52bb86743" containerName="registry-server" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.036034 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.036368 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.039319 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.041932 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-spntp" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.048920 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.049119 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.049153 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzvd\" (UniqueName: \"kubernetes.io/projected/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-kube-api-access-kwzvd\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.150136 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.150251 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.150280 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwzvd\" (UniqueName: \"kubernetes.io/projected/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-kube-api-access-kwzvd\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.157986 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " 
pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.159031 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.171275 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwzvd\" (UniqueName: \"kubernetes.io/projected/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-kube-api-access-kwzvd\") pod \"nova-cell0-conductor-0\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.355623 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.355821 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287ef842-0502-41ff-ae1c-ea849066edca" path="/var/lib/kubelet/pods/287ef842-0502-41ff-ae1c-ea849066edca/volumes" Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.810010 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:09:18 crc kubenswrapper[5047]: I0223 07:09:18.937231 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5408d3e3-d1ab-45e8-b226-0eb3b26fe183","Type":"ContainerStarted","Data":"3b60870e78547f020b35c009e24ea5aeb788ace7ef0605c9ce610748dce17860"} Feb 23 07:09:19 crc kubenswrapper[5047]: I0223 07:09:19.954190 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5408d3e3-d1ab-45e8-b226-0eb3b26fe183","Type":"ContainerStarted","Data":"64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd"} Feb 23 
07:09:19 crc kubenswrapper[5047]: I0223 07:09:19.955319 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:24 crc kubenswrapper[5047]: I0223 07:09:24.116623 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 07:09:24 crc kubenswrapper[5047]: I0223 07:09:24.163415 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=7.163381852 podStartE2EDuration="7.163381852s" podCreationTimestamp="2026-02-23 07:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:19.988230076 +0000 UTC m=+1482.239557210" watchObservedRunningTime="2026-02-23 07:09:24.163381852 +0000 UTC m=+1486.414709036" Feb 23 07:09:28 crc kubenswrapper[5047]: I0223 07:09:28.455816 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 23 07:09:28 crc kubenswrapper[5047]: I0223 07:09:28.639224 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:09:28 crc kubenswrapper[5047]: I0223 07:09:28.639499 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c0b4e975-a97e-42a0-80a8-f97734bcaff2" containerName="kube-state-metrics" containerID="cri-o://162cb7f8c6e55e75e580ec58b388952191f54181ae019a1fa7f507a11cbff878" gracePeriod=30 Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.080479 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xnqnk"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.082441 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.086673 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.086755 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.100567 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnqnk"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.105614 5047 generic.go:334] "Generic (PLEG): container finished" podID="c0b4e975-a97e-42a0-80a8-f97734bcaff2" containerID="162cb7f8c6e55e75e580ec58b388952191f54181ae019a1fa7f507a11cbff878" exitCode=2 Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.105660 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c0b4e975-a97e-42a0-80a8-f97734bcaff2","Type":"ContainerDied","Data":"162cb7f8c6e55e75e580ec58b388952191f54181ae019a1fa7f507a11cbff878"} Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.105687 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c0b4e975-a97e-42a0-80a8-f97734bcaff2","Type":"ContainerDied","Data":"19817381454e5b946ff94789c84fd485fc5ea8df527ca113f294803b838b2d7e"} Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.105700 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19817381454e5b946ff94789c84fd485fc5ea8df527ca113f294803b838b2d7e" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.189125 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.249246 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjwt\" (UniqueName: \"kubernetes.io/projected/9ad17d77-4987-46fc-aa46-622e08b708ea-kube-api-access-rqjwt\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.249415 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-config-data\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.249483 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-scripts\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.249517 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.350900 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptpq6\" (UniqueName: \"kubernetes.io/projected/c0b4e975-a97e-42a0-80a8-f97734bcaff2-kube-api-access-ptpq6\") pod 
\"c0b4e975-a97e-42a0-80a8-f97734bcaff2\" (UID: \"c0b4e975-a97e-42a0-80a8-f97734bcaff2\") " Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.351185 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjwt\" (UniqueName: \"kubernetes.io/projected/9ad17d77-4987-46fc-aa46-622e08b708ea-kube-api-access-rqjwt\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.351279 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-config-data\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.351318 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-scripts\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.351346 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.377746 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b4e975-a97e-42a0-80a8-f97734bcaff2-kube-api-access-ptpq6" (OuterVolumeSpecName: "kube-api-access-ptpq6") pod "c0b4e975-a97e-42a0-80a8-f97734bcaff2" (UID: 
"c0b4e975-a97e-42a0-80a8-f97734bcaff2"). InnerVolumeSpecName "kube-api-access-ptpq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.378870 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:29 crc kubenswrapper[5047]: E0223 07:09:29.379407 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b4e975-a97e-42a0-80a8-f97734bcaff2" containerName="kube-state-metrics" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.379431 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b4e975-a97e-42a0-80a8-f97734bcaff2" containerName="kube-state-metrics" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.379624 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b4e975-a97e-42a0-80a8-f97734bcaff2" containerName="kube-state-metrics" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.384062 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-config-data\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.389433 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-scripts\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.389931 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.392685 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.420310 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.442375 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.443948 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjwt\" (UniqueName: \"kubernetes.io/projected/9ad17d77-4987-46fc-aa46-622e08b708ea-kube-api-access-rqjwt\") pod \"nova-cell0-cell-mapping-xnqnk\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.457806 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptpq6\" (UniqueName: \"kubernetes.io/projected/c0b4e975-a97e-42a0-80a8-f97734bcaff2-kube-api-access-ptpq6\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.505845 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.538979 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.540656 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.549300 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.561371 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a93090-b5d0-4a31-8aff-b597e9d00112-logs\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.561437 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.561467 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.561526 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7lt4\" (UniqueName: \"kubernetes.io/projected/64f8ba7b-8aef-4487-826e-04b1a5cc2631-kube-api-access-x7lt4\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.561549 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-config-data\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.561570 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-config-data\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.561593 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m95lz\" (UniqueName: \"kubernetes.io/projected/12a93090-b5d0-4a31-8aff-b597e9d00112-kube-api-access-m95lz\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.593808 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.664895 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.684708 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.685820 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.687647 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7lt4\" (UniqueName: \"kubernetes.io/projected/64f8ba7b-8aef-4487-826e-04b1a5cc2631-kube-api-access-x7lt4\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.687825 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-config-data\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.687951 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-config-data\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.688080 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m95lz\" (UniqueName: \"kubernetes.io/projected/12a93090-b5d0-4a31-8aff-b597e9d00112-kube-api-access-m95lz\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.688274 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a93090-b5d0-4a31-8aff-b597e9d00112-logs\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.688529 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.691449 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a93090-b5d0-4a31-8aff-b597e9d00112-logs\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.700208 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-config-data\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.701616 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.701639 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-config-data\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " 
pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.706057 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.707397 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.711027 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.758663 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7lt4\" (UniqueName: \"kubernetes.io/projected/64f8ba7b-8aef-4487-826e-04b1a5cc2631-kube-api-access-x7lt4\") pod \"nova-scheduler-0\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.759641 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m95lz\" (UniqueName: \"kubernetes.io/projected/12a93090-b5d0-4a31-8aff-b597e9d00112-kube-api-access-m95lz\") pod \"nova-metadata-0\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") " pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.788776 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-7kfxn"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.791767 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.793431 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.793517 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brlz9\" (UniqueName: \"kubernetes.io/projected/71cf0b35-5745-4939-b559-5041d7070842-kube-api-access-brlz9\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.793545 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-config-data\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.793576 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf0b35-5745-4939-b559-5041d7070842-logs\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.806429 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.865845 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-7kfxn"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.911005 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.912447 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.913506 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.914378 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.914568 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brlz9\" (UniqueName: \"kubernetes.io/projected/71cf0b35-5745-4939-b559-5041d7070842-kube-api-access-brlz9\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.914713 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-config-data\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.914975 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-q8whx\" (UniqueName: \"kubernetes.io/projected/d48e05a8-af41-43a5-bc35-19b17bf735e3-kube-api-access-q8whx\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.915099 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf0b35-5745-4939-b559-5041d7070842-logs\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.915273 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-config\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.916360 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.916456 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.917526 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.917713 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.918195 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf0b35-5745-4939-b559-5041d7070842-logs\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.922555 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.927177 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.938771 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-config-data\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5047]: I0223 07:09:29.979751 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brlz9\" (UniqueName: 
\"kubernetes.io/projected/71cf0b35-5745-4939-b559-5041d7070842-kube-api-access-brlz9\") pod \"nova-api-0\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " pod="openstack/nova-api-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.019266 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021060 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-config\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021108 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021167 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021224 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021257 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021283 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bxv\" (UniqueName: \"kubernetes.io/projected/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-kube-api-access-z2bxv\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021316 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021353 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.021397 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8whx\" (UniqueName: \"kubernetes.io/projected/d48e05a8-af41-43a5-bc35-19b17bf735e3-kube-api-access-q8whx\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.029048 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.029573 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-config\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.029629 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.030182 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.030718 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.056867 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 
07:09:30.134657 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.134795 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.135150 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bxv\" (UniqueName: \"kubernetes.io/projected/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-kube-api-access-z2bxv\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.150188 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.165980 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bxv\" (UniqueName: \"kubernetes.io/projected/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-kube-api-access-z2bxv\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.171243 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q8whx\" (UniqueName: \"kubernetes.io/projected/d48e05a8-af41-43a5-bc35-19b17bf735e3-kube-api-access-q8whx\") pod \"dnsmasq-dns-75ddbf7c75-7kfxn\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.172521 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.195174 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.280766 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.300746 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.313139 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.315528 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.320497 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.320714 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.336249 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.372194 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b4e975-a97e-42a0-80a8-f97734bcaff2" path="/var/lib/kubelet/pods/c0b4e975-a97e-42a0-80a8-f97734bcaff2/volumes" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.376085 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.420493 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.450570 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.450692 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwc9m\" (UniqueName: \"kubernetes.io/projected/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-api-access-cwc9m\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.450794 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.450829 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.552721 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.553189 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.553253 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.553295 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwc9m\" (UniqueName: \"kubernetes.io/projected/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-api-access-cwc9m\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.560213 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.561207 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.569278 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.582694 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwc9m\" (UniqueName: \"kubernetes.io/projected/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-api-access-cwc9m\") pod \"kube-state-metrics-0\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.620408 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnqnk"] Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.670153 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.711364 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: W0223 07:09:30.726646 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12a93090_b5d0_4a31_8aff_b597e9d00112.slice/crio-5b4ee0f492d05b6ae837fe10b17013e465089e450b9a22368d62154b6fc8d897 WatchSource:0}: Error finding container 5b4ee0f492d05b6ae837fe10b17013e465089e450b9a22368d62154b6fc8d897: Status 404 returned error can't find the container with id 5b4ee0f492d05b6ae837fe10b17013e465089e450b9a22368d62154b6fc8d897 Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.730671 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.864618 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: I0223 07:09:30.970094 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:09:30 crc kubenswrapper[5047]: W0223 07:09:30.975250 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71cf0b35_5745_4939_b559_5041d7070842.slice/crio-9cce3d8392e1e0f63b2b7f5f4b06b55ab060989ceabc12c76ee8e79e4af5749b WatchSource:0}: Error finding container 9cce3d8392e1e0f63b2b7f5f4b06b55ab060989ceabc12c76ee8e79e4af5749b: Status 404 returned error can't find the container with id 9cce3d8392e1e0f63b2b7f5f4b06b55ab060989ceabc12c76ee8e79e4af5749b Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.051002 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwwzn"] Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.057028 5047 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.061158 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.065574 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.072536 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwwzn"] Feb 23 07:09:31 crc kubenswrapper[5047]: W0223 07:09:31.097256 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f196b0e_f688_4c8b_8e1a_a37f8d3009d4.slice/crio-8eda1f4225609c2d1ee636d87513eafb2745086686158c652f1d2bc863c0270c WatchSource:0}: Error finding container 8eda1f4225609c2d1ee636d87513eafb2745086686158c652f1d2bc863c0270c: Status 404 returned error can't find the container with id 8eda1f4225609c2d1ee636d87513eafb2745086686158c652f1d2bc863c0270c Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.112126 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.159030 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-7kfxn"] Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.173861 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g762q\" (UniqueName: \"kubernetes.io/projected/4d5409fa-0e21-401f-9b92-09556836eb13-kube-api-access-g762q\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 
07:09:31.173986 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-config-data\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.174039 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.174068 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-scripts\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.208406 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4","Type":"ContainerStarted","Data":"8eda1f4225609c2d1ee636d87513eafb2745086686158c652f1d2bc863c0270c"} Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.209448 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71cf0b35-5745-4939-b559-5041d7070842","Type":"ContainerStarted","Data":"9cce3d8392e1e0f63b2b7f5f4b06b55ab060989ceabc12c76ee8e79e4af5749b"} Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.210818 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" 
event={"ID":"d48e05a8-af41-43a5-bc35-19b17bf735e3","Type":"ContainerStarted","Data":"eb3488a2cd70c23a480536e538adeb7f84157834dbcb09415d0243a028e55264"} Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.212570 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64f8ba7b-8aef-4487-826e-04b1a5cc2631","Type":"ContainerStarted","Data":"2f2094adafca6b3a1b0401262c221dd7623c35b258058d615c91720c3442ce03"} Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.223273 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnqnk" event={"ID":"9ad17d77-4987-46fc-aa46-622e08b708ea","Type":"ContainerStarted","Data":"954c1f1ee794e4e77d08d6c1a2df71e2d46a99260aa2ccf3f544ba5c564093a1"} Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.223339 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnqnk" event={"ID":"9ad17d77-4987-46fc-aa46-622e08b708ea","Type":"ContainerStarted","Data":"faa39768562358438bda2798eb7df9d18e2efa3701ab7a676d5998d2f71be864"} Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.226715 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12a93090-b5d0-4a31-8aff-b597e9d00112","Type":"ContainerStarted","Data":"5b4ee0f492d05b6ae837fe10b17013e465089e450b9a22368d62154b6fc8d897"} Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.255729 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xnqnk" podStartSLOduration=2.255708624 podStartE2EDuration="2.255708624s" podCreationTimestamp="2026-02-23 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:31.24191737 +0000 UTC m=+1493.493244504" watchObservedRunningTime="2026-02-23 07:09:31.255708624 +0000 UTC m=+1493.507035758" Feb 23 07:09:31 crc kubenswrapper[5047]: 
I0223 07:09:31.275880 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g762q\" (UniqueName: \"kubernetes.io/projected/4d5409fa-0e21-401f-9b92-09556836eb13-kube-api-access-g762q\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.276075 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-config-data\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.276139 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.276172 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-scripts\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.285012 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-config-data\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 
07:09:31.285482 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.293175 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-scripts\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.298560 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.301983 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g762q\" (UniqueName: \"kubernetes.io/projected/4d5409fa-0e21-401f-9b92-09556836eb13-kube-api-access-g762q\") pod \"nova-cell1-conductor-db-sync-cwwzn\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.388936 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwwzn"
Feb 23 07:09:31 crc kubenswrapper[5047]: I0223 07:09:31.894877 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwwzn"]
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.241272 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" event={"ID":"4d5409fa-0e21-401f-9b92-09556836eb13","Type":"ContainerStarted","Data":"74b40c582ae12999f42d1eef301f44f0da1417815226e698bfe5b38ce08e9a34"}
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.241726 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" event={"ID":"4d5409fa-0e21-401f-9b92-09556836eb13","Type":"ContainerStarted","Data":"9785a85eacdb780f25b0e7c2831c3d53b25f9fd437cf25996d0240e9db8fb430"}
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.250497 5047 generic.go:334] "Generic (PLEG): container finished" podID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerID="6bb60eeb023dade211e23276a0c4b1a2179398333be845aeede6772875cc232c" exitCode=0
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.250611 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" event={"ID":"d48e05a8-af41-43a5-bc35-19b17bf735e3","Type":"ContainerDied","Data":"6bb60eeb023dade211e23276a0c4b1a2179398333be845aeede6772875cc232c"}
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.265203 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"95d08092-3eac-4289-b594-a77d5dfecfe9","Type":"ContainerStarted","Data":"897b14a78dfbcdad54f56571d5c5ba61205fe2678162c41bb69a05cb9c0da5b7"}
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.265247 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"95d08092-3eac-4289-b594-a77d5dfecfe9","Type":"ContainerStarted","Data":"74fe8d1d5f8f73a388f5636574972cc01a2efd9b5d40891d42ee819b7fe495ba"}
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.265611 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.277591 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" podStartSLOduration=1.277563894 podStartE2EDuration="1.277563894s" podCreationTimestamp="2026-02-23 07:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:32.257339552 +0000 UTC m=+1494.508666686" watchObservedRunningTime="2026-02-23 07:09:32.277563894 +0000 UTC m=+1494.528891028"
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.336331 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.927396927 podStartE2EDuration="2.336296602s" podCreationTimestamp="2026-02-23 07:09:30 +0000 UTC" firstStartedPulling="2026-02-23 07:09:31.312423739 +0000 UTC m=+1493.563750873" lastFinishedPulling="2026-02-23 07:09:31.721323414 +0000 UTC m=+1493.972650548" observedRunningTime="2026-02-23 07:09:32.289665493 +0000 UTC m=+1494.540992627" watchObservedRunningTime="2026-02-23 07:09:32.336296602 +0000 UTC m=+1494.587623736"
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.437172 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.437543 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-central-agent" containerID="cri-o://e0180980acad646ff81dc12d1d2091d5dcafd78c9e1c8d1583d842b36e52d4ce" gracePeriod=30
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.438000 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="sg-core" containerID="cri-o://e8c19b482335e07fe147b02aa7397108698ecbcd460d651e57c356e9bb8d81d2" gracePeriod=30
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.438076 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-notification-agent" containerID="cri-o://8aec5caea1192302b7b1be1c7ec79843c909cd94e3ed9794a7052b7df22f1b8d" gracePeriod=30
Feb 23 07:09:32 crc kubenswrapper[5047]: I0223 07:09:32.438077 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="proxy-httpd" containerID="cri-o://d9df5839961aafe72950647a46bfa6c0f2fe6e62de134192322d86962cb62235" gracePeriod=30
Feb 23 07:09:33 crc kubenswrapper[5047]: E0223 07:09:33.009472 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe1983c_dee4_4f0f_a3bb_1ad6df2fc4f4.slice/crio-e8c19b482335e07fe147b02aa7397108698ecbcd460d651e57c356e9bb8d81d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe1983c_dee4_4f0f_a3bb_1ad6df2fc4f4.slice/crio-d9df5839961aafe72950647a46bfa6c0f2fe6e62de134192322d86962cb62235.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe1983c_dee4_4f0f_a3bb_1ad6df2fc4f4.slice/crio-conmon-e8c19b482335e07fe147b02aa7397108698ecbcd460d651e57c356e9bb8d81d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe1983c_dee4_4f0f_a3bb_1ad6df2fc4f4.slice/crio-conmon-d9df5839961aafe72950647a46bfa6c0f2fe6e62de134192322d86962cb62235.scope\": RecentStats: unable to find data in memory cache]"
Feb 23 07:09:33 crc kubenswrapper[5047]: I0223 07:09:33.287075 5047 generic.go:334] "Generic (PLEG): container finished" podID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerID="d9df5839961aafe72950647a46bfa6c0f2fe6e62de134192322d86962cb62235" exitCode=0
Feb 23 07:09:33 crc kubenswrapper[5047]: I0223 07:09:33.287130 5047 generic.go:334] "Generic (PLEG): container finished" podID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerID="e8c19b482335e07fe147b02aa7397108698ecbcd460d651e57c356e9bb8d81d2" exitCode=2
Feb 23 07:09:33 crc kubenswrapper[5047]: I0223 07:09:33.287867 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerDied","Data":"d9df5839961aafe72950647a46bfa6c0f2fe6e62de134192322d86962cb62235"}
Feb 23 07:09:33 crc kubenswrapper[5047]: I0223 07:09:33.287960 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerDied","Data":"e8c19b482335e07fe147b02aa7397108698ecbcd460d651e57c356e9bb8d81d2"}
Feb 23 07:09:33 crc kubenswrapper[5047]: I0223 07:09:33.481001 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:09:33 crc kubenswrapper[5047]: I0223 07:09:33.494525 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 07:09:34 crc kubenswrapper[5047]: I0223 07:09:34.298282 5047 generic.go:334] "Generic (PLEG): container finished" podID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerID="e0180980acad646ff81dc12d1d2091d5dcafd78c9e1c8d1583d842b36e52d4ce" exitCode=0
Feb 23 07:09:34 crc kubenswrapper[5047]: I0223 07:09:34.299520 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerDied","Data":"e0180980acad646ff81dc12d1d2091d5dcafd78c9e1c8d1583d842b36e52d4ce"}
Feb 23 07:09:35 crc kubenswrapper[5047]: I0223 07:09:35.322065 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71cf0b35-5745-4939-b559-5041d7070842","Type":"ContainerStarted","Data":"ddf61b1500decbbe2b784b296ee90daade4d60aa333558ed7ade2fb7040ee01f"}
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.335045 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12a93090-b5d0-4a31-8aff-b597e9d00112","Type":"ContainerStarted","Data":"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"}
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.335351 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-metadata" containerID="cri-o://e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec" gracePeriod=30
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.335664 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12a93090-b5d0-4a31-8aff-b597e9d00112","Type":"ContainerStarted","Data":"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"}
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.335309 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-log" containerID="cri-o://46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20" gracePeriod=30
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.342307 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843" gracePeriod=30
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.375212 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4","Type":"ContainerStarted","Data":"f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843"}
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.375292 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71cf0b35-5745-4939-b559-5041d7070842","Type":"ContainerStarted","Data":"76567726ef56e2a98637e729008f178ecec9754e29f941487c6716f5670595f3"}
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.377818 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.145141095 podStartE2EDuration="7.377795586s" podCreationTimestamp="2026-02-23 07:09:29 +0000 UTC" firstStartedPulling="2026-02-23 07:09:30.730453291 +0000 UTC m=+1492.981780425" lastFinishedPulling="2026-02-23 07:09:34.963107782 +0000 UTC m=+1497.214434916" observedRunningTime="2026-02-23 07:09:36.371039997 +0000 UTC m=+1498.622367131" watchObservedRunningTime="2026-02-23 07:09:36.377795586 +0000 UTC m=+1498.629122720"
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.384664 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" event={"ID":"d48e05a8-af41-43a5-bc35-19b17bf735e3","Type":"ContainerStarted","Data":"3f73a342fb9259de5b85d35476814c1b8fb2a82ad57a4291c4eba08237571620"}
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.385007 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn"
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.391540 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64f8ba7b-8aef-4487-826e-04b1a5cc2631","Type":"ContainerStarted","Data":"aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5"}
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.409162 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.554096714 podStartE2EDuration="7.409136362s" podCreationTimestamp="2026-02-23 07:09:29 +0000 UTC" firstStartedPulling="2026-02-23 07:09:31.101700185 +0000 UTC m=+1493.353027319" lastFinishedPulling="2026-02-23 07:09:34.956739833 +0000 UTC m=+1497.208066967" observedRunningTime="2026-02-23 07:09:36.397144356 +0000 UTC m=+1498.648471530" watchObservedRunningTime="2026-02-23 07:09:36.409136362 +0000 UTC m=+1498.660463496"
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.425434 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.471906917 podStartE2EDuration="7.425404191s" podCreationTimestamp="2026-02-23 07:09:29 +0000 UTC" firstStartedPulling="2026-02-23 07:09:30.978389595 +0000 UTC m=+1493.229716729" lastFinishedPulling="2026-02-23 07:09:34.931886869 +0000 UTC m=+1497.183214003" observedRunningTime="2026-02-23 07:09:36.420000288 +0000 UTC m=+1498.671327422" watchObservedRunningTime="2026-02-23 07:09:36.425404191 +0000 UTC m=+1498.676731325"
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.452732 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.422027391 podStartE2EDuration="7.452709449s" podCreationTimestamp="2026-02-23 07:09:29 +0000 UTC" firstStartedPulling="2026-02-23 07:09:30.907168858 +0000 UTC m=+1493.158495992" lastFinishedPulling="2026-02-23 07:09:34.937850916 +0000 UTC m=+1497.189178050" observedRunningTime="2026-02-23 07:09:36.446439785 +0000 UTC m=+1498.697766919" watchObservedRunningTime="2026-02-23 07:09:36.452709449 +0000 UTC m=+1498.704036583"
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.928491 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 07:09:36 crc kubenswrapper[5047]: I0223 07:09:36.955749 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" podStartSLOduration=7.955719237 podStartE2EDuration="7.955719237s" podCreationTimestamp="2026-02-23 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:36.469315267 +0000 UTC m=+1498.720642421" watchObservedRunningTime="2026-02-23 07:09:36.955719237 +0000 UTC m=+1499.207046371"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.058375 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-combined-ca-bundle\") pod \"12a93090-b5d0-4a31-8aff-b597e9d00112\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") "
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.058434 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-config-data\") pod \"12a93090-b5d0-4a31-8aff-b597e9d00112\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") "
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.058650 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a93090-b5d0-4a31-8aff-b597e9d00112-logs\") pod \"12a93090-b5d0-4a31-8aff-b597e9d00112\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") "
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.059324 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12a93090-b5d0-4a31-8aff-b597e9d00112-logs" (OuterVolumeSpecName: "logs") pod "12a93090-b5d0-4a31-8aff-b597e9d00112" (UID: "12a93090-b5d0-4a31-8aff-b597e9d00112"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.059390 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m95lz\" (UniqueName: \"kubernetes.io/projected/12a93090-b5d0-4a31-8aff-b597e9d00112-kube-api-access-m95lz\") pod \"12a93090-b5d0-4a31-8aff-b597e9d00112\" (UID: \"12a93090-b5d0-4a31-8aff-b597e9d00112\") "
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.060354 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12a93090-b5d0-4a31-8aff-b597e9d00112-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.072488 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a93090-b5d0-4a31-8aff-b597e9d00112-kube-api-access-m95lz" (OuterVolumeSpecName: "kube-api-access-m95lz") pod "12a93090-b5d0-4a31-8aff-b597e9d00112" (UID: "12a93090-b5d0-4a31-8aff-b597e9d00112"). InnerVolumeSpecName "kube-api-access-m95lz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.094636 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-config-data" (OuterVolumeSpecName: "config-data") pod "12a93090-b5d0-4a31-8aff-b597e9d00112" (UID: "12a93090-b5d0-4a31-8aff-b597e9d00112"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.107616 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12a93090-b5d0-4a31-8aff-b597e9d00112" (UID: "12a93090-b5d0-4a31-8aff-b597e9d00112"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.162752 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.162789 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a93090-b5d0-4a31-8aff-b597e9d00112-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.162798 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m95lz\" (UniqueName: \"kubernetes.io/projected/12a93090-b5d0-4a31-8aff-b597e9d00112-kube-api-access-m95lz\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.405079 5047 generic.go:334] "Generic (PLEG): container finished" podID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerID="e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec" exitCode=0
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.405115 5047 generic.go:334] "Generic (PLEG): container finished" podID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerID="46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20" exitCode=143
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.405142 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12a93090-b5d0-4a31-8aff-b597e9d00112","Type":"ContainerDied","Data":"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"}
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.405187 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.405233 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12a93090-b5d0-4a31-8aff-b597e9d00112","Type":"ContainerDied","Data":"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"}
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.405255 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12a93090-b5d0-4a31-8aff-b597e9d00112","Type":"ContainerDied","Data":"5b4ee0f492d05b6ae837fe10b17013e465089e450b9a22368d62154b6fc8d897"}
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.405257 5047 scope.go:117] "RemoveContainer" containerID="e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.438032 5047 scope.go:117] "RemoveContainer" containerID="46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.444187 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.456615 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.485565 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:09:37 crc kubenswrapper[5047]: E0223 07:09:37.486493 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-log"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.486514 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-log"
Feb 23 07:09:37 crc kubenswrapper[5047]: E0223 07:09:37.486562 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-metadata"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.486570 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-metadata"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.486729 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-metadata"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.486751 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" containerName="nova-metadata-log"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.488859 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.492854 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.493188 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.514181 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.530280 5047 scope.go:117] "RemoveContainer" containerID="e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"
Feb 23 07:09:37 crc kubenswrapper[5047]: E0223 07:09:37.531350 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec\": container with ID starting with e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec not found: ID does not exist" containerID="e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.531388 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"} err="failed to get container status \"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec\": rpc error: code = NotFound desc = could not find container \"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec\": container with ID starting with e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec not found: ID does not exist"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.531413 5047 scope.go:117] "RemoveContainer" containerID="46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"
Feb 23 07:09:37 crc kubenswrapper[5047]: E0223 07:09:37.532107 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20\": container with ID starting with 46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20 not found: ID does not exist" containerID="46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.532155 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"} err="failed to get container status \"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20\": rpc error: code = NotFound desc = could not find container \"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20\": container with ID starting with 46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20 not found: ID does not exist"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.532190 5047 scope.go:117] "RemoveContainer" containerID="e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.533306 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec"} err="failed to get container status \"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec\": rpc error: code = NotFound desc = could not find container \"e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec\": container with ID starting with e38d5c9494ca4f528b6607fc14f7ed34acf6f8e7594ad2065c2228a59f926aec not found: ID does not exist"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.533333 5047 scope.go:117] "RemoveContainer" containerID="46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.533684 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20"} err="failed to get container status \"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20\": rpc error: code = NotFound desc = could not find container \"46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20\": container with ID starting with 46b2a69efdde8517aca46d223c4656445856c6d5207c516b71d3d00b21898d20 not found: ID does not exist"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.571857 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d17bb28-592f-4218-b2b0-7bf0ff03808c-logs\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.571965 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-config-data\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.572001 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2nwl\" (UniqueName: \"kubernetes.io/projected/4d17bb28-592f-4218-b2b0-7bf0ff03808c-kube-api-access-c2nwl\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.572071 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.572103 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.674818 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.675025 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d17bb28-592f-4218-b2b0-7bf0ff03808c-logs\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.675085 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-config-data\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.675124 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2nwl\" (UniqueName: \"kubernetes.io/projected/4d17bb28-592f-4218-b2b0-7bf0ff03808c-kube-api-access-c2nwl\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.675203 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.675686 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d17bb28-592f-4218-b2b0-7bf0ff03808c-logs\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.682825 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.683829 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-config-data\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.686416 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.695841 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2nwl\" (UniqueName: \"kubernetes.io/projected/4d17bb28-592f-4218-b2b0-7bf0ff03808c-kube-api-access-c2nwl\") pod \"nova-metadata-0\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " pod="openstack/nova-metadata-0"
Feb 23 07:09:37 crc kubenswrapper[5047]: I0223 07:09:37.834316 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 07:09:38 crc kubenswrapper[5047]: I0223 07:09:38.384680 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a93090-b5d0-4a31-8aff-b597e9d00112" path="/var/lib/kubelet/pods/12a93090-b5d0-4a31-8aff-b597e9d00112/volumes"
Feb 23 07:09:38 crc kubenswrapper[5047]: I0223 07:09:38.386683 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:09:38 crc kubenswrapper[5047]: I0223 07:09:38.432240 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d17bb28-592f-4218-b2b0-7bf0ff03808c","Type":"ContainerStarted","Data":"92843fc9ad4defc0e9a8d1f8970ed274cf650491c3578f2a9ee5b27a116abc6d"}
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.450468 5047 generic.go:334] "Generic (PLEG): container finished" podID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerID="8aec5caea1192302b7b1be1c7ec79843c909cd94e3ed9794a7052b7df22f1b8d" exitCode=0
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.450558 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerDied","Data":"8aec5caea1192302b7b1be1c7ec79843c909cd94e3ed9794a7052b7df22f1b8d"}
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.454942 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d17bb28-592f-4218-b2b0-7bf0ff03808c","Type":"ContainerStarted","Data":"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3"}
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.454983 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d17bb28-592f-4218-b2b0-7bf0ff03808c","Type":"ContainerStarted","Data":"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce"}
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.484256 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.484233886 podStartE2EDuration="2.484233886s" podCreationTimestamp="2026-02-23 07:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:39.482160071 +0000 UTC m=+1501.733487225" watchObservedRunningTime="2026-02-23 07:09:39.484233886 +0000 UTC m=+1501.735561030"
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.550530 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.735189 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-scripts\") pod \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") "
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.735262 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-config-data\") pod \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") "
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.735333 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-sg-core-conf-yaml\") pod \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") "
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.735438 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-log-httpd\") pod \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") "
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.735467 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7khkz\" (UniqueName: \"kubernetes.io/projected/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-kube-api-access-7khkz\") pod \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") "
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.735510 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-combined-ca-bundle\") pod \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") "
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.735603 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-run-httpd\") pod \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\" (UID: \"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4\") "
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.736455 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" (UID: "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.737619 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" (UID: "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.744308 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-kube-api-access-7khkz" (OuterVolumeSpecName: "kube-api-access-7khkz") pod "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" (UID: "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4"). InnerVolumeSpecName "kube-api-access-7khkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.761891 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-scripts" (OuterVolumeSpecName: "scripts") pod "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" (UID: "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.794026 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" (UID: "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4"). InnerVolumeSpecName "sg-core-conf-yaml".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.841006 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.841062 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.841074 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.841085 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.841100 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7khkz\" (UniqueName: \"kubernetes.io/projected/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-kube-api-access-7khkz\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.847167 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" (UID: "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.874855 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-config-data" (OuterVolumeSpecName: "config-data") pod "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" (UID: "cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.914185 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.914255 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.943635 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.943671 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:39 crc kubenswrapper[5047]: I0223 07:09:39.947149 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.020700 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.022349 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.338174 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.409471 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2"] Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.409753 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" podUID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerName="dnsmasq-dns" containerID="cri-o://3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf" gracePeriod=10 Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.422083 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.471942 5047 generic.go:334] "Generic (PLEG): container finished" podID="9ad17d77-4987-46fc-aa46-622e08b708ea" containerID="954c1f1ee794e4e77d08d6c1a2df71e2d46a99260aa2ccf3f544ba5c564093a1" exitCode=0 Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.472049 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnqnk" event={"ID":"9ad17d77-4987-46fc-aa46-622e08b708ea","Type":"ContainerDied","Data":"954c1f1ee794e4e77d08d6c1a2df71e2d46a99260aa2ccf3f544ba5c564093a1"} Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.481417 5047 generic.go:334] "Generic (PLEG): container finished" podID="4d5409fa-0e21-401f-9b92-09556836eb13" containerID="74b40c582ae12999f42d1eef301f44f0da1417815226e698bfe5b38ce08e9a34" exitCode=0 Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.481491 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" event={"ID":"4d5409fa-0e21-401f-9b92-09556836eb13","Type":"ContainerDied","Data":"74b40c582ae12999f42d1eef301f44f0da1417815226e698bfe5b38ce08e9a34"} Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.484432 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4","Type":"ContainerDied","Data":"174fc1d9065bc442bb17d94cf085f30b567e60778ccf2edacbef965785977f7d"} Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.484472 5047 scope.go:117] "RemoveContainer" containerID="d9df5839961aafe72950647a46bfa6c0f2fe6e62de134192322d86962cb62235" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.484624 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.527885 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.559845 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.564571 5047 scope.go:117] "RemoveContainer" containerID="e8c19b482335e07fe147b02aa7397108698ecbcd460d651e57c356e9bb8d81d2" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.583330 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.591949 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:40 crc kubenswrapper[5047]: E0223 07:09:40.592430 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="sg-core" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592447 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="sg-core" Feb 23 07:09:40 crc kubenswrapper[5047]: E0223 07:09:40.592468 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-notification-agent" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592475 5047 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-notification-agent" Feb 23 07:09:40 crc kubenswrapper[5047]: E0223 07:09:40.592499 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-central-agent" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592505 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-central-agent" Feb 23 07:09:40 crc kubenswrapper[5047]: E0223 07:09:40.592517 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="proxy-httpd" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592549 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="proxy-httpd" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592806 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="sg-core" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592829 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="proxy-httpd" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592844 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-notification-agent" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.592855 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" containerName="ceilometer-central-agent" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.595147 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.617285 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.621258 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.623106 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.636128 5047 scope.go:117] "RemoveContainer" containerID="8aec5caea1192302b7b1be1c7ec79843c909cd94e3ed9794a7052b7df22f1b8d" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.672263 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j988r\" (UniqueName: \"kubernetes.io/projected/55cf0e5d-62dd-4b49-8e84-cd9307857a82-kube-api-access-j988r\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.672350 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.673003 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-run-httpd\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.673336 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-config-data\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.673507 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-log-httpd\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.673632 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.673682 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-scripts\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.674703 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.695267 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.699245 5047 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.700506 5047 scope.go:117] "RemoveContainer" containerID="e0180980acad646ff81dc12d1d2091d5dcafd78c9e1c8d1583d842b36e52d4ce" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776386 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-log-httpd\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776435 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776456 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-scripts\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776478 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776538 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j988r\" (UniqueName: \"kubernetes.io/projected/55cf0e5d-62dd-4b49-8e84-cd9307857a82-kube-api-access-j988r\") pod \"ceilometer-0\" (UID: 
\"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776569 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776603 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-run-httpd\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.776678 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-config-data\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.779988 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-log-httpd\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.780334 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-run-httpd\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.785410 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.785814 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.786749 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-config-data\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.805211 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-scripts\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.806480 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:40 crc kubenswrapper[5047]: I0223 07:09:40.810468 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j988r\" (UniqueName: \"kubernetes.io/projected/55cf0e5d-62dd-4b49-8e84-cd9307857a82-kube-api-access-j988r\") pod \"ceilometer-0\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " pod="openstack/ceilometer-0" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.010778 5047 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.070680 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.103133 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.103153 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.189669 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-sb\") pod \"c53524c5-f61d-4400-b15e-75d3a13e8297\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.189765 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-nb\") pod \"c53524c5-f61d-4400-b15e-75d3a13e8297\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.189891 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-swift-storage-0\") pod 
\"c53524c5-f61d-4400-b15e-75d3a13e8297\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.189941 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-svc\") pod \"c53524c5-f61d-4400-b15e-75d3a13e8297\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.189980 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77ljk\" (UniqueName: \"kubernetes.io/projected/c53524c5-f61d-4400-b15e-75d3a13e8297-kube-api-access-77ljk\") pod \"c53524c5-f61d-4400-b15e-75d3a13e8297\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.190033 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-config\") pod \"c53524c5-f61d-4400-b15e-75d3a13e8297\" (UID: \"c53524c5-f61d-4400-b15e-75d3a13e8297\") " Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.202659 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53524c5-f61d-4400-b15e-75d3a13e8297-kube-api-access-77ljk" (OuterVolumeSpecName: "kube-api-access-77ljk") pod "c53524c5-f61d-4400-b15e-75d3a13e8297" (UID: "c53524c5-f61d-4400-b15e-75d3a13e8297"). InnerVolumeSpecName "kube-api-access-77ljk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.262760 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c53524c5-f61d-4400-b15e-75d3a13e8297" (UID: "c53524c5-f61d-4400-b15e-75d3a13e8297"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.274447 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c53524c5-f61d-4400-b15e-75d3a13e8297" (UID: "c53524c5-f61d-4400-b15e-75d3a13e8297"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.285481 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c53524c5-f61d-4400-b15e-75d3a13e8297" (UID: "c53524c5-f61d-4400-b15e-75d3a13e8297"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.291085 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-config" (OuterVolumeSpecName: "config") pod "c53524c5-f61d-4400-b15e-75d3a13e8297" (UID: "c53524c5-f61d-4400-b15e-75d3a13e8297"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.292727 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.292758 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77ljk\" (UniqueName: \"kubernetes.io/projected/c53524c5-f61d-4400-b15e-75d3a13e8297-kube-api-access-77ljk\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.292769 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.292778 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.292788 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.310559 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c53524c5-f61d-4400-b15e-75d3a13e8297" (UID: "c53524c5-f61d-4400-b15e-75d3a13e8297"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.395850 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c53524c5-f61d-4400-b15e-75d3a13e8297-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.500689 5047 generic.go:334] "Generic (PLEG): container finished" podID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerID="3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf" exitCode=0 Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.501358 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.501517 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" event={"ID":"c53524c5-f61d-4400-b15e-75d3a13e8297","Type":"ContainerDied","Data":"3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf"} Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.515088 5047 scope.go:117] "RemoveContainer" containerID="3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.515065 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2" event={"ID":"c53524c5-f61d-4400-b15e-75d3a13e8297","Type":"ContainerDied","Data":"e6911942035096026ad06d77ea8f79879d181b8f2ce45ce50563caaa696c48ae"} Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.519315 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.667874 5047 scope.go:117] "RemoveContainer" containerID="ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.670968 5047 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2"] Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.681223 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xwjw2"] Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.718519 5047 scope.go:117] "RemoveContainer" containerID="3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf" Feb 23 07:09:41 crc kubenswrapper[5047]: E0223 07:09:41.719184 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf\": container with ID starting with 3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf not found: ID does not exist" containerID="3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.719220 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf"} err="failed to get container status \"3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf\": rpc error: code = NotFound desc = could not find container \"3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf\": container with ID starting with 3dd3bd2be726ff15885515d1e19edb35ae6b4705c213bdb72bc493ebf2b48abf not found: ID does not exist" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.719247 5047 scope.go:117] "RemoveContainer" containerID="ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032" Feb 23 07:09:41 crc kubenswrapper[5047]: E0223 07:09:41.719668 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032\": container with ID starting with 
ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032 not found: ID does not exist" containerID="ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.719690 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032"} err="failed to get container status \"ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032\": rpc error: code = NotFound desc = could not find container \"ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032\": container with ID starting with ff18d81ab3929d4bd39416c335b372ef47753f96edbb2e9d38b02a6a42d92032 not found: ID does not exist" Feb 23 07:09:41 crc kubenswrapper[5047]: I0223 07:09:41.944982 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.030158 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.115591 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-scripts\") pod \"9ad17d77-4987-46fc-aa46-622e08b708ea\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.115722 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-combined-ca-bundle\") pod \"9ad17d77-4987-46fc-aa46-622e08b708ea\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.115807 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-config-data\") pod \"9ad17d77-4987-46fc-aa46-622e08b708ea\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.115861 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjwt\" (UniqueName: \"kubernetes.io/projected/9ad17d77-4987-46fc-aa46-622e08b708ea-kube-api-access-rqjwt\") pod \"9ad17d77-4987-46fc-aa46-622e08b708ea\" (UID: \"9ad17d77-4987-46fc-aa46-622e08b708ea\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.122512 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-scripts" (OuterVolumeSpecName: "scripts") pod "9ad17d77-4987-46fc-aa46-622e08b708ea" (UID: "9ad17d77-4987-46fc-aa46-622e08b708ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.124172 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad17d77-4987-46fc-aa46-622e08b708ea-kube-api-access-rqjwt" (OuterVolumeSpecName: "kube-api-access-rqjwt") pod "9ad17d77-4987-46fc-aa46-622e08b708ea" (UID: "9ad17d77-4987-46fc-aa46-622e08b708ea"). InnerVolumeSpecName "kube-api-access-rqjwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.159434 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-config-data" (OuterVolumeSpecName: "config-data") pod "9ad17d77-4987-46fc-aa46-622e08b708ea" (UID: "9ad17d77-4987-46fc-aa46-622e08b708ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.160137 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ad17d77-4987-46fc-aa46-622e08b708ea" (UID: "9ad17d77-4987-46fc-aa46-622e08b708ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.218206 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g762q\" (UniqueName: \"kubernetes.io/projected/4d5409fa-0e21-401f-9b92-09556836eb13-kube-api-access-g762q\") pod \"4d5409fa-0e21-401f-9b92-09556836eb13\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.218277 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-combined-ca-bundle\") pod \"4d5409fa-0e21-401f-9b92-09556836eb13\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.218349 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-scripts\") pod \"4d5409fa-0e21-401f-9b92-09556836eb13\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.218547 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-config-data\") pod \"4d5409fa-0e21-401f-9b92-09556836eb13\" (UID: \"4d5409fa-0e21-401f-9b92-09556836eb13\") " Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.219022 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.219043 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 
07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.219055 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad17d77-4987-46fc-aa46-622e08b708ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.219066 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjwt\" (UniqueName: \"kubernetes.io/projected/9ad17d77-4987-46fc-aa46-622e08b708ea-kube-api-access-rqjwt\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.222942 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5409fa-0e21-401f-9b92-09556836eb13-kube-api-access-g762q" (OuterVolumeSpecName: "kube-api-access-g762q") pod "4d5409fa-0e21-401f-9b92-09556836eb13" (UID: "4d5409fa-0e21-401f-9b92-09556836eb13"). InnerVolumeSpecName "kube-api-access-g762q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.224327 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-scripts" (OuterVolumeSpecName: "scripts") pod "4d5409fa-0e21-401f-9b92-09556836eb13" (UID: "4d5409fa-0e21-401f-9b92-09556836eb13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.258441 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d5409fa-0e21-401f-9b92-09556836eb13" (UID: "4d5409fa-0e21-401f-9b92-09556836eb13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.266896 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-config-data" (OuterVolumeSpecName: "config-data") pod "4d5409fa-0e21-401f-9b92-09556836eb13" (UID: "4d5409fa-0e21-401f-9b92-09556836eb13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.321297 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.321341 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g762q\" (UniqueName: \"kubernetes.io/projected/4d5409fa-0e21-401f-9b92-09556836eb13-kube-api-access-g762q\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.321356 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.321367 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5409fa-0e21-401f-9b92-09556836eb13-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.370038 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53524c5-f61d-4400-b15e-75d3a13e8297" path="/var/lib/kubelet/pods/c53524c5-f61d-4400-b15e-75d3a13e8297/volumes" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.371791 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4" 
path="/var/lib/kubelet/pods/cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4/volumes" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.513329 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerStarted","Data":"c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579"} Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.513393 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerStarted","Data":"b76e87f8ca9945022f71387af0336143c537b3566cb6618c59f723e1377248e1"} Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.518848 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnqnk" event={"ID":"9ad17d77-4987-46fc-aa46-622e08b708ea","Type":"ContainerDied","Data":"faa39768562358438bda2798eb7df9d18e2efa3701ab7a676d5998d2f71be864"} Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.519385 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faa39768562358438bda2798eb7df9d18e2efa3701ab7a676d5998d2f71be864" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.518897 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnqnk" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.530958 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" event={"ID":"4d5409fa-0e21-401f-9b92-09556836eb13","Type":"ContainerDied","Data":"9785a85eacdb780f25b0e7c2831c3d53b25f9fd437cf25996d0240e9db8fb430"} Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.531020 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9785a85eacdb780f25b0e7c2831c3d53b25f9fd437cf25996d0240e9db8fb430" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.531130 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwwzn" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.634048 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:09:42 crc kubenswrapper[5047]: E0223 07:09:42.642207 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerName="dnsmasq-dns" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.642246 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerName="dnsmasq-dns" Feb 23 07:09:42 crc kubenswrapper[5047]: E0223 07:09:42.642262 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerName="init" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.642269 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerName="init" Feb 23 07:09:42 crc kubenswrapper[5047]: E0223 07:09:42.642294 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5409fa-0e21-401f-9b92-09556836eb13" containerName="nova-cell1-conductor-db-sync" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 
07:09:42.642302 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5409fa-0e21-401f-9b92-09556836eb13" containerName="nova-cell1-conductor-db-sync" Feb 23 07:09:42 crc kubenswrapper[5047]: E0223 07:09:42.642340 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad17d77-4987-46fc-aa46-622e08b708ea" containerName="nova-manage" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.642347 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad17d77-4987-46fc-aa46-622e08b708ea" containerName="nova-manage" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.642703 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad17d77-4987-46fc-aa46-622e08b708ea" containerName="nova-manage" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.642715 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5409fa-0e21-401f-9b92-09556836eb13" containerName="nova-cell1-conductor-db-sync" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.642727 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53524c5-f61d-4400-b15e-75d3a13e8297" containerName="dnsmasq-dns" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.643514 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.646982 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.662582 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.730027 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.730095 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.730125 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtwx\" (UniqueName: \"kubernetes.io/projected/911535c0-45eb-4361-b169-fad54a54d78b-kube-api-access-kmtwx\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.781040 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.781380 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-log" 
containerID="cri-o://ddf61b1500decbbe2b784b296ee90daade4d60aa333558ed7ade2fb7040ee01f" gracePeriod=30 Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.781542 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-api" containerID="cri-o://76567726ef56e2a98637e729008f178ecec9754e29f941487c6716f5670595f3" gracePeriod=30 Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.809158 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.809448 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="64f8ba7b-8aef-4487-826e-04b1a5cc2631" containerName="nova-scheduler-scheduler" containerID="cri-o://aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5" gracePeriod=30 Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.831445 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.831498 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.831533 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtwx\" (UniqueName: \"kubernetes.io/projected/911535c0-45eb-4361-b169-fad54a54d78b-kube-api-access-kmtwx\") pod 
\"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.834391 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.834532 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.837543 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.851277 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.885589 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtwx\" (UniqueName: \"kubernetes.io/projected/911535c0-45eb-4361-b169-fad54a54d78b-kube-api-access-kmtwx\") pod \"nova-cell1-conductor-0\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.919647 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:42 crc kubenswrapper[5047]: I0223 07:09:42.977115 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:43 crc kubenswrapper[5047]: I0223 07:09:43.502039 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:09:43 crc kubenswrapper[5047]: I0223 07:09:43.545816 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerStarted","Data":"06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726"} Feb 23 07:09:43 crc kubenswrapper[5047]: I0223 07:09:43.547716 5047 generic.go:334] "Generic (PLEG): container finished" podID="71cf0b35-5745-4939-b559-5041d7070842" containerID="ddf61b1500decbbe2b784b296ee90daade4d60aa333558ed7ade2fb7040ee01f" exitCode=143 Feb 23 07:09:43 crc kubenswrapper[5047]: I0223 07:09:43.547832 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71cf0b35-5745-4939-b559-5041d7070842","Type":"ContainerDied","Data":"ddf61b1500decbbe2b784b296ee90daade4d60aa333558ed7ade2fb7040ee01f"} Feb 23 07:09:43 crc kubenswrapper[5047]: I0223 07:09:43.551033 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-log" containerID="cri-o://59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce" gracePeriod=30 Feb 23 07:09:43 crc kubenswrapper[5047]: I0223 07:09:43.551423 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"911535c0-45eb-4361-b169-fad54a54d78b","Type":"ContainerStarted","Data":"88888b246d29f927a3c12dc0e8aa82e6bcb34276355e5324bd30894dc59ad8af"} Feb 23 07:09:43 crc kubenswrapper[5047]: I0223 07:09:43.551983 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-metadata" 
containerID="cri-o://085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3" gracePeriod=30 Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.151653 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.266989 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-config-data\") pod \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.267206 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2nwl\" (UniqueName: \"kubernetes.io/projected/4d17bb28-592f-4218-b2b0-7bf0ff03808c-kube-api-access-c2nwl\") pod \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.267387 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d17bb28-592f-4218-b2b0-7bf0ff03808c-logs\") pod \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.267440 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-nova-metadata-tls-certs\") pod \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.267519 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-combined-ca-bundle\") pod 
\"4d17bb28-592f-4218-b2b0-7bf0ff03808c\" (UID: \"4d17bb28-592f-4218-b2b0-7bf0ff03808c\") " Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.268808 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d17bb28-592f-4218-b2b0-7bf0ff03808c-logs" (OuterVolumeSpecName: "logs") pod "4d17bb28-592f-4218-b2b0-7bf0ff03808c" (UID: "4d17bb28-592f-4218-b2b0-7bf0ff03808c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.269743 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d17bb28-592f-4218-b2b0-7bf0ff03808c-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.279105 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d17bb28-592f-4218-b2b0-7bf0ff03808c-kube-api-access-c2nwl" (OuterVolumeSpecName: "kube-api-access-c2nwl") pod "4d17bb28-592f-4218-b2b0-7bf0ff03808c" (UID: "4d17bb28-592f-4218-b2b0-7bf0ff03808c"). InnerVolumeSpecName "kube-api-access-c2nwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.297451 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d17bb28-592f-4218-b2b0-7bf0ff03808c" (UID: "4d17bb28-592f-4218-b2b0-7bf0ff03808c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.300313 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-config-data" (OuterVolumeSpecName: "config-data") pod "4d17bb28-592f-4218-b2b0-7bf0ff03808c" (UID: "4d17bb28-592f-4218-b2b0-7bf0ff03808c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.325940 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4d17bb28-592f-4218-b2b0-7bf0ff03808c" (UID: "4d17bb28-592f-4218-b2b0-7bf0ff03808c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.371782 5047 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.371823 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.371834 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d17bb28-592f-4218-b2b0-7bf0ff03808c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.371843 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2nwl\" (UniqueName: 
\"kubernetes.io/projected/4d17bb28-592f-4218-b2b0-7bf0ff03808c-kube-api-access-c2nwl\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.582048 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"911535c0-45eb-4361-b169-fad54a54d78b","Type":"ContainerStarted","Data":"fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6"} Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.582256 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.584058 5047 generic.go:334] "Generic (PLEG): container finished" podID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerID="085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3" exitCode=0 Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.584101 5047 generic.go:334] "Generic (PLEG): container finished" podID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerID="59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce" exitCode=143 Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.584165 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d17bb28-592f-4218-b2b0-7bf0ff03808c","Type":"ContainerDied","Data":"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3"} Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.584222 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d17bb28-592f-4218-b2b0-7bf0ff03808c","Type":"ContainerDied","Data":"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce"} Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.584250 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"4d17bb28-592f-4218-b2b0-7bf0ff03808c","Type":"ContainerDied","Data":"92843fc9ad4defc0e9a8d1f8970ed274cf650491c3578f2a9ee5b27a116abc6d"} Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.584264 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.584277 5047 scope.go:117] "RemoveContainer" containerID="085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.588687 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerStarted","Data":"9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb"} Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.620584 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.620562743 podStartE2EDuration="2.620562743s" podCreationTimestamp="2026-02-23 07:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:44.60260098 +0000 UTC m=+1506.853928154" watchObservedRunningTime="2026-02-23 07:09:44.620562743 +0000 UTC m=+1506.871889877" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.622389 5047 scope.go:117] "RemoveContainer" containerID="59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.647721 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.653285 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.660061 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 
23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.660498 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-metadata" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.660516 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-metadata" Feb 23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.660544 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-log" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.660552 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-log" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.660710 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-metadata" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.660730 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" containerName="nova-metadata-log" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.664443 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.677046 5047 scope.go:117] "RemoveContainer" containerID="085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3" Feb 23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.678693 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3\": container with ID starting with 085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3 not found: ID does not exist" containerID="085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.678751 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3"} err="failed to get container status \"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3\": rpc error: code = NotFound desc = could not find container \"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3\": container with ID starting with 085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3 not found: ID does not exist" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.678791 5047 scope.go:117] "RemoveContainer" containerID="59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce" Feb 23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.679968 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce\": container with ID starting with 59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce not found: ID does not exist" containerID="59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 
07:09:44.680006 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce"} err="failed to get container status \"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce\": rpc error: code = NotFound desc = could not find container \"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce\": container with ID starting with 59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce not found: ID does not exist" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.680028 5047 scope.go:117] "RemoveContainer" containerID="085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.680345 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3"} err="failed to get container status \"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3\": rpc error: code = NotFound desc = could not find container \"085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3\": container with ID starting with 085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3 not found: ID does not exist" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.680418 5047 scope.go:117] "RemoveContainer" containerID="59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.681666 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce"} err="failed to get container status \"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce\": rpc error: code = NotFound desc = could not find container \"59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce\": container with ID starting with 
59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce not found: ID does not exist" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.681850 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.683080 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.695092 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.781732 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-config-data\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.781945 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.782186 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc9b19ab-9f8a-4127-944e-116e2b843310-logs\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.782413 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbkr\" (UniqueName: 
\"kubernetes.io/projected/cc9b19ab-9f8a-4127-944e-116e2b843310-kube-api-access-8cbkr\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.783073 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.885695 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc9b19ab-9f8a-4127-944e-116e2b843310-logs\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.885801 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbkr\" (UniqueName: \"kubernetes.io/projected/cc9b19ab-9f8a-4127-944e-116e2b843310-kube-api-access-8cbkr\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.885939 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.885983 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-config-data\") pod \"nova-metadata-0\" (UID: 
\"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.886012 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.886443 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc9b19ab-9f8a-4127-944e-116e2b843310-logs\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.891077 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.892821 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-config-data\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.893279 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: I0223 07:09:44.905810 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8cbkr\" (UniqueName: \"kubernetes.io/projected/cc9b19ab-9f8a-4127-944e-116e2b843310-kube-api-access-8cbkr\") pod \"nova-metadata-0\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " pod="openstack/nova-metadata-0" Feb 23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.917993 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.920417 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.921784 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:09:44 crc kubenswrapper[5047]: E0223 07:09:44.921952 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="64f8ba7b-8aef-4487-826e-04b1a5cc2631" containerName="nova-scheduler-scheduler" Feb 23 07:09:45 crc kubenswrapper[5047]: I0223 07:09:45.011577 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:45 crc kubenswrapper[5047]: I0223 07:09:45.576004 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:45 crc kubenswrapper[5047]: I0223 07:09:45.614774 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc9b19ab-9f8a-4127-944e-116e2b843310","Type":"ContainerStarted","Data":"c833ebb18e775fee47c5c108bafc1f9d4d9cd700b14092cb139726e8ff0f50ec"} Feb 23 07:09:45 crc kubenswrapper[5047]: I0223 07:09:45.620122 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerStarted","Data":"ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b"} Feb 23 07:09:45 crc kubenswrapper[5047]: I0223 07:09:45.620273 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:09:45 crc kubenswrapper[5047]: I0223 07:09:45.658323 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.116135549 podStartE2EDuration="5.658295912s" podCreationTimestamp="2026-02-23 07:09:40 +0000 UTC" firstStartedPulling="2026-02-23 07:09:41.540809646 +0000 UTC m=+1503.792136780" lastFinishedPulling="2026-02-23 07:09:45.082970009 +0000 UTC m=+1507.334297143" observedRunningTime="2026-02-23 07:09:45.643744998 +0000 UTC m=+1507.895072152" watchObservedRunningTime="2026-02-23 07:09:45.658295912 +0000 UTC m=+1507.909623046" Feb 23 07:09:46 crc kubenswrapper[5047]: I0223 07:09:46.363931 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d17bb28-592f-4218-b2b0-7bf0ff03808c" path="/var/lib/kubelet/pods/4d17bb28-592f-4218-b2b0-7bf0ff03808c/volumes" Feb 23 07:09:46 crc kubenswrapper[5047]: I0223 07:09:46.633647 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"cc9b19ab-9f8a-4127-944e-116e2b843310","Type":"ContainerStarted","Data":"84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2"} Feb 23 07:09:46 crc kubenswrapper[5047]: I0223 07:09:46.634659 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc9b19ab-9f8a-4127-944e-116e2b843310","Type":"ContainerStarted","Data":"32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364"} Feb 23 07:09:46 crc kubenswrapper[5047]: I0223 07:09:46.665004 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.664974833 podStartE2EDuration="2.664974833s" podCreationTimestamp="2026-02-23 07:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:46.651038076 +0000 UTC m=+1508.902365210" watchObservedRunningTime="2026-02-23 07:09:46.664974833 +0000 UTC m=+1508.916301967" Feb 23 07:09:46 crc kubenswrapper[5047]: I0223 07:09:46.760304 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:09:46 crc kubenswrapper[5047]: I0223 07:09:46.760426 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.648177 5047 generic.go:334] "Generic (PLEG): container finished" podID="71cf0b35-5745-4939-b559-5041d7070842" 
containerID="76567726ef56e2a98637e729008f178ecec9754e29f941487c6716f5670595f3" exitCode=0 Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.648713 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71cf0b35-5745-4939-b559-5041d7070842","Type":"ContainerDied","Data":"76567726ef56e2a98637e729008f178ecec9754e29f941487c6716f5670595f3"} Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.659217 5047 generic.go:334] "Generic (PLEG): container finished" podID="64f8ba7b-8aef-4487-826e-04b1a5cc2631" containerID="aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5" exitCode=0 Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.659328 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64f8ba7b-8aef-4487-826e-04b1a5cc2631","Type":"ContainerDied","Data":"aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5"} Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.746228 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.863470 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-config-data\") pod \"71cf0b35-5745-4939-b559-5041d7070842\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.864067 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf0b35-5745-4939-b559-5041d7070842-logs\") pod \"71cf0b35-5745-4939-b559-5041d7070842\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.864118 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-combined-ca-bundle\") pod \"71cf0b35-5745-4939-b559-5041d7070842\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.864198 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brlz9\" (UniqueName: \"kubernetes.io/projected/71cf0b35-5745-4939-b559-5041d7070842-kube-api-access-brlz9\") pod \"71cf0b35-5745-4939-b559-5041d7070842\" (UID: \"71cf0b35-5745-4939-b559-5041d7070842\") " Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.864702 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71cf0b35-5745-4939-b559-5041d7070842-logs" (OuterVolumeSpecName: "logs") pod "71cf0b35-5745-4939-b559-5041d7070842" (UID: "71cf0b35-5745-4939-b559-5041d7070842"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.864938 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cf0b35-5745-4939-b559-5041d7070842-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.870807 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cf0b35-5745-4939-b559-5041d7070842-kube-api-access-brlz9" (OuterVolumeSpecName: "kube-api-access-brlz9") pod "71cf0b35-5745-4939-b559-5041d7070842" (UID: "71cf0b35-5745-4939-b559-5041d7070842"). InnerVolumeSpecName "kube-api-access-brlz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.892932 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71cf0b35-5745-4939-b559-5041d7070842" (UID: "71cf0b35-5745-4939-b559-5041d7070842"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.896219 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-config-data" (OuterVolumeSpecName: "config-data") pod "71cf0b35-5745-4939-b559-5041d7070842" (UID: "71cf0b35-5745-4939-b559-5041d7070842"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.931652 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.967628 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.967679 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cf0b35-5745-4939-b559-5041d7070842-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:47 crc kubenswrapper[5047]: I0223 07:09:47.967698 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brlz9\" (UniqueName: \"kubernetes.io/projected/71cf0b35-5745-4939-b559-5041d7070842-kube-api-access-brlz9\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.069523 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7lt4\" (UniqueName: \"kubernetes.io/projected/64f8ba7b-8aef-4487-826e-04b1a5cc2631-kube-api-access-x7lt4\") pod \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.069688 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-combined-ca-bundle\") pod \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.069867 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-config-data\") pod \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\" (UID: \"64f8ba7b-8aef-4487-826e-04b1a5cc2631\") " Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 
07:09:48.081723 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f8ba7b-8aef-4487-826e-04b1a5cc2631-kube-api-access-x7lt4" (OuterVolumeSpecName: "kube-api-access-x7lt4") pod "64f8ba7b-8aef-4487-826e-04b1a5cc2631" (UID: "64f8ba7b-8aef-4487-826e-04b1a5cc2631"). InnerVolumeSpecName "kube-api-access-x7lt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.104140 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64f8ba7b-8aef-4487-826e-04b1a5cc2631" (UID: "64f8ba7b-8aef-4487-826e-04b1a5cc2631"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.104192 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-config-data" (OuterVolumeSpecName: "config-data") pod "64f8ba7b-8aef-4487-826e-04b1a5cc2631" (UID: "64f8ba7b-8aef-4487-826e-04b1a5cc2631"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.173265 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.173314 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7lt4\" (UniqueName: \"kubernetes.io/projected/64f8ba7b-8aef-4487-826e-04b1a5cc2631-kube-api-access-x7lt4\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.173337 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f8ba7b-8aef-4487-826e-04b1a5cc2631-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.670709 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"71cf0b35-5745-4939-b559-5041d7070842","Type":"ContainerDied","Data":"9cce3d8392e1e0f63b2b7f5f4b06b55ab060989ceabc12c76ee8e79e4af5749b"} Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.670756 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.670797 5047 scope.go:117] "RemoveContainer" containerID="76567726ef56e2a98637e729008f178ecec9754e29f941487c6716f5670595f3" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.672473 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64f8ba7b-8aef-4487-826e-04b1a5cc2631","Type":"ContainerDied","Data":"2f2094adafca6b3a1b0401262c221dd7623c35b258058d615c91720c3442ce03"} Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.672506 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.708145 5047 scope.go:117] "RemoveContainer" containerID="ddf61b1500decbbe2b784b296ee90daade4d60aa333558ed7ade2fb7040ee01f" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.718684 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.743997 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.755000 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.763282 5047 scope.go:117] "RemoveContainer" containerID="aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.818986 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.849139 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 07:09:48 crc kubenswrapper[5047]: E0223 07:09:48.849931 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-api" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.849952 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-api" Feb 23 07:09:48 crc kubenswrapper[5047]: E0223 07:09:48.849993 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f8ba7b-8aef-4487-826e-04b1a5cc2631" containerName="nova-scheduler-scheduler" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.850001 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f8ba7b-8aef-4487-826e-04b1a5cc2631" containerName="nova-scheduler-scheduler" Feb 23 07:09:48 crc 
kubenswrapper[5047]: E0223 07:09:48.850018 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-log" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.850026 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-log" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.850369 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-api" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.850399 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f8ba7b-8aef-4487-826e-04b1a5cc2631" containerName="nova-scheduler-scheduler" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.850426 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cf0b35-5745-4939-b559-5041d7070842" containerName="nova-api-log" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.852153 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.855173 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.865591 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.867808 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.870033 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.885968 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:48 crc kubenswrapper[5047]: I0223 07:09:48.898272 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.002327 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.003040 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ab0013-7eda-4e2e-8244-242e7e896603-logs\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.003076 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-config-data\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.003283 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5c99\" (UniqueName: \"kubernetes.io/projected/84ab0013-7eda-4e2e-8244-242e7e896603-kube-api-access-w5c99\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.003539 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-config-data\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.003575 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2kpv\" (UniqueName: \"kubernetes.io/projected/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-kube-api-access-r2kpv\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.003840 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.105836 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2kpv\" (UniqueName: \"kubernetes.io/projected/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-kube-api-access-r2kpv\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.105889 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-config-data\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.105961 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.106018 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.106051 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ab0013-7eda-4e2e-8244-242e7e896603-logs\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.106079 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-config-data\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.106112 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5c99\" (UniqueName: \"kubernetes.io/projected/84ab0013-7eda-4e2e-8244-242e7e896603-kube-api-access-w5c99\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.106782 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ab0013-7eda-4e2e-8244-242e7e896603-logs\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.113054 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-config-data\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.113592 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.121837 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.121838 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-config-data\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.125528 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2kpv\" (UniqueName: \"kubernetes.io/projected/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-kube-api-access-r2kpv\") pod \"nova-scheduler-0\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.126403 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5c99\" (UniqueName: \"kubernetes.io/projected/84ab0013-7eda-4e2e-8244-242e7e896603-kube-api-access-w5c99\") pod \"nova-api-0\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.181727 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.194886 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.737351 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:49 crc kubenswrapper[5047]: I0223 07:09:49.753807 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.050478 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.050549 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.353527 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f8ba7b-8aef-4487-826e-04b1a5cc2631" path="/var/lib/kubelet/pods/64f8ba7b-8aef-4487-826e-04b1a5cc2631/volumes"
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.354334 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71cf0b35-5745-4939-b559-5041d7070842" path="/var/lib/kubelet/pods/71cf0b35-5745-4939-b559-5041d7070842/volumes"
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.719421 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84ab0013-7eda-4e2e-8244-242e7e896603","Type":"ContainerStarted","Data":"12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b"}
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.719970 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84ab0013-7eda-4e2e-8244-242e7e896603","Type":"ContainerStarted","Data":"8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697"}
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.719983 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84ab0013-7eda-4e2e-8244-242e7e896603","Type":"ContainerStarted","Data":"68d4909b24395592e18ce12926fd11571d55d554e18a7eb37e140479049773cb"}
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.723214 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a09d8fa2-5b55-4c94-8c50-fe34bb06924e","Type":"ContainerStarted","Data":"31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b"}
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.723254 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a09d8fa2-5b55-4c94-8c50-fe34bb06924e","Type":"ContainerStarted","Data":"d669cc20e7935aa938555ad8bebade0d1007a5c5a2b381433f9577226cdaa552"}
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.797033 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.797004332 podStartE2EDuration="2.797004332s" podCreationTimestamp="2026-02-23 07:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:50.76694811 +0000 UTC m=+1513.018275334" watchObservedRunningTime="2026-02-23 07:09:50.797004332 +0000 UTC m=+1513.048331476"
Feb 23 07:09:50 crc kubenswrapper[5047]: I0223 07:09:50.807856 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.807826987 podStartE2EDuration="2.807826987s" podCreationTimestamp="2026-02-23 07:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:50.789654708 +0000 UTC m=+1513.040981862" watchObservedRunningTime="2026-02-23 07:09:50.807826987 +0000 UTC m=+1513.059154131"
Feb 23 07:09:53 crc kubenswrapper[5047]: I0223 07:09:53.015055 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 23 07:09:54 crc kubenswrapper[5047]: I0223 07:09:54.195568 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 23 07:09:54 crc kubenswrapper[5047]: E0223 07:09:54.661135 5047 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/078f5acbb1349163f1b14975ba40185c6b49fe99dffe9bec7408e389d41167b9/diff" to get inode usage: stat /var/lib/containers/storage/overlay/078f5acbb1349163f1b14975ba40185c6b49fe99dffe9bec7408e389d41167b9/diff: no such file or directory, extraDiskErr:
Feb 23 07:09:55 crc kubenswrapper[5047]: I0223 07:09:55.013122 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 23 07:09:55 crc kubenswrapper[5047]: I0223 07:09:55.013201 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 23 07:09:56 crc kubenswrapper[5047]: E0223 07:09:56.011492 5047 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7ccff8451ff9c51daff97ce36b7fde520a08eb7d1e1f7e4c5ef0a9d9aff669a2/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7ccff8451ff9c51daff97ce36b7fde520a08eb7d1e1f7e4c5ef0a9d9aff669a2/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_cfe1983c-dee4-4f0f-a3bb-1ad6df2fc4f4/ceilometer-notification-agent/0.log: no such file or directory
Feb 23 07:09:56 crc kubenswrapper[5047]: I0223 07:09:56.030236 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:09:56 crc kubenswrapper[5047]: I0223 07:09:56.030256 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:09:59 crc kubenswrapper[5047]: I0223 07:09:59.181975 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 07:09:59 crc kubenswrapper[5047]: I0223 07:09:59.182388 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 07:09:59 crc kubenswrapper[5047]: I0223 07:09:59.195578 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 23 07:09:59 crc kubenswrapper[5047]: I0223 07:09:59.232042 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 23 07:09:59 crc kubenswrapper[5047]: I0223 07:09:59.894306 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 23 07:10:00 crc kubenswrapper[5047]: I0223 07:10:00.223145 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:00 crc kubenswrapper[5047]: I0223 07:10:00.264172 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:05 crc kubenswrapper[5047]: I0223 07:10:05.022281 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 07:10:05 crc kubenswrapper[5047]: I0223 07:10:05.022804 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 07:10:05 crc kubenswrapper[5047]: I0223 07:10:05.031343 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 07:10:05 crc kubenswrapper[5047]: I0223 07:10:05.034741 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 07:10:06 crc kubenswrapper[5047]: W0223 07:10:06.421825 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d17bb28_592f_4218_b2b0_7bf0ff03808c.slice/crio-59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce.scope WatchSource:0}: Error finding container 59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce: Status 404 returned error can't find the container with id 59eced8219d7ad316f6ed23c87e9d8b1fd2f17b99be63615e6b552ed4f18f9ce
Feb 23 07:10:06 crc kubenswrapper[5047]: W0223 07:10:06.427075 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d17bb28_592f_4218_b2b0_7bf0ff03808c.slice/crio-085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3.scope WatchSource:0}: Error finding container 085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3: Status 404 returned error can't find the container with id 085c7cd83c7dc7efc8ebaa33aa0e3289c66e4cac500841630088e17a820513a3
Feb 23 07:10:06 crc kubenswrapper[5047]: E0223 07:10:06.658852 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d17bb28_592f_4218_b2b0_7bf0ff03808c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64f8ba7b_8aef_4487_826e_04b1a5cc2631.slice/crio-conmon-aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d17bb28_592f_4218_b2b0_7bf0ff03808c.slice/crio-92843fc9ad4defc0e9a8d1f8970ed274cf650491c3578f2a9ee5b27a116abc6d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64f8ba7b_8aef_4487_826e_04b1a5cc2631.slice/crio-2f2094adafca6b3a1b0401262c221dd7623c35b258058d615c91720c3442ce03\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64f8ba7b_8aef_4487_826e_04b1a5cc2631.slice/crio-aaf169b1c047fa6009018bffbc5020f20fa43632fb684268a6cb15fbcba02ac5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71cf0b35_5745_4939_b559_5041d7070842.slice/crio-76567726ef56e2a98637e729008f178ecec9754e29f941487c6716f5670595f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f196b0e_f688_4c8b_8e1a_a37f8d3009d4.slice/crio-f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71cf0b35_5745_4939_b559_5041d7070842.slice/crio-9cce3d8392e1e0f63b2b7f5f4b06b55ab060989ceabc12c76ee8e79e4af5749b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71cf0b35_5745_4939_b559_5041d7070842.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f196b0e_f688_4c8b_8e1a_a37f8d3009d4.slice/crio-conmon-f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71cf0b35_5745_4939_b559_5041d7070842.slice/crio-conmon-76567726ef56e2a98637e729008f178ecec9754e29f941487c6716f5670595f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64f8ba7b_8aef_4487_826e_04b1a5cc2631.slice\": RecentStats: unable to find data in memory cache]"
Feb 23 07:10:06 crc kubenswrapper[5047]: I0223 07:10:06.900675 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:06 crc kubenswrapper[5047]: I0223 07:10:06.965100 5047 generic.go:334] "Generic (PLEG): container finished" podID="4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" containerID="f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843" exitCode=137
Feb 23 07:10:06 crc kubenswrapper[5047]: I0223 07:10:06.965157 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:06 crc kubenswrapper[5047]: I0223 07:10:06.965226 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4","Type":"ContainerDied","Data":"f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843"}
Feb 23 07:10:06 crc kubenswrapper[5047]: I0223 07:10:06.965301 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4","Type":"ContainerDied","Data":"8eda1f4225609c2d1ee636d87513eafb2745086686158c652f1d2bc863c0270c"}
Feb 23 07:10:06 crc kubenswrapper[5047]: I0223 07:10:06.965330 5047 scope.go:117] "RemoveContainer" containerID="f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.003352 5047 scope.go:117] "RemoveContainer" containerID="f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843"
Feb 23 07:10:07 crc kubenswrapper[5047]: E0223 07:10:07.004193 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843\": container with ID starting with f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843 not found: ID does not exist" containerID="f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.004259 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843"} err="failed to get container status \"f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843\": rpc error: code = NotFound desc = could not find container \"f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843\": container with ID starting with f81cdd926c88927bd57800c94f7c7fd3c9dc58361c810ce09c33bbc22fa43843 not found: ID does not exist"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.084542 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2bxv\" (UniqueName: \"kubernetes.io/projected/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-kube-api-access-z2bxv\") pod \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") "
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.084766 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-config-data\") pod \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") "
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.084860 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-combined-ca-bundle\") pod \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\" (UID: \"4f196b0e-f688-4c8b-8e1a-a37f8d3009d4\") "
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.095172 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-kube-api-access-z2bxv" (OuterVolumeSpecName: "kube-api-access-z2bxv") pod "4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" (UID: "4f196b0e-f688-4c8b-8e1a-a37f8d3009d4"). InnerVolumeSpecName "kube-api-access-z2bxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.133519 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" (UID: "4f196b0e-f688-4c8b-8e1a-a37f8d3009d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.136003 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-config-data" (OuterVolumeSpecName: "config-data") pod "4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" (UID: "4f196b0e-f688-4c8b-8e1a-a37f8d3009d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.188634 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2bxv\" (UniqueName: \"kubernetes.io/projected/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-kube-api-access-z2bxv\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.188689 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.188710 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.343231 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.355656 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.371186 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 07:10:07 crc kubenswrapper[5047]: E0223 07:10:07.372329 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.372375 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.372740 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.374280 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.376626 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.378505 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.379071 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.386532 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.496504 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.496621 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.496795 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.496845 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dktq\" (UniqueName: \"kubernetes.io/projected/d513cfd3-cb98-440f-b564-d36d8f20f5a4-kube-api-access-7dktq\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.497010 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.599220 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.599487 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.599616 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.599658 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dktq\" (UniqueName: \"kubernetes.io/projected/d513cfd3-cb98-440f-b564-d36d8f20f5a4-kube-api-access-7dktq\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.599731 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.608959 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.612069 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.617231 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.617508 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.633967 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dktq\" (UniqueName: \"kubernetes.io/projected/d513cfd3-cb98-440f-b564-d36d8f20f5a4-kube-api-access-7dktq\") pod \"nova-cell1-novncproxy-0\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:07 crc kubenswrapper[5047]: I0223 07:10:07.713118 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:08 crc kubenswrapper[5047]: I0223 07:10:08.058449 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 07:10:08 crc kubenswrapper[5047]: W0223 07:10:08.061228 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd513cfd3_cb98_440f_b564_d36d8f20f5a4.slice/crio-56e9e173bf16fb369d683be7c134bed29b8b32cbd978952eb87f9751d77dabbd WatchSource:0}: Error finding container 56e9e173bf16fb369d683be7c134bed29b8b32cbd978952eb87f9751d77dabbd: Status 404 returned error can't find the container with id 56e9e173bf16fb369d683be7c134bed29b8b32cbd978952eb87f9751d77dabbd
Feb 23 07:10:08 crc kubenswrapper[5047]: I0223 07:10:08.362822 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f196b0e-f688-4c8b-8e1a-a37f8d3009d4" path="/var/lib/kubelet/pods/4f196b0e-f688-4c8b-8e1a-a37f8d3009d4/volumes"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.000145 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d513cfd3-cb98-440f-b564-d36d8f20f5a4","Type":"ContainerStarted","Data":"106b2bdb20d1a2d2e867ad06c463fa1d14f76f5b815d49afbd6405b8e6d0af57"}
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.000624 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d513cfd3-cb98-440f-b564-d36d8f20f5a4","Type":"ContainerStarted","Data":"56e9e173bf16fb369d683be7c134bed29b8b32cbd978952eb87f9751d77dabbd"}
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.044960 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.044893363 podStartE2EDuration="2.044893363s" podCreationTimestamp="2026-02-23 07:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:09.032612179 +0000 UTC m=+1531.283939373" watchObservedRunningTime="2026-02-23 07:10:09.044893363 +0000 UTC m=+1531.296220537"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.188885 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.189508 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.190019 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.190095 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.198760 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.200662 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.489434 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7677694455-7s2lc"]
Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.491271 5047 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.504700 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-7s2lc"] Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.650242 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-svc\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.650309 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.650336 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnz8l\" (UniqueName: \"kubernetes.io/projected/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-kube-api-access-bnz8l\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.650442 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-config\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.650532 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.650561 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.752592 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-svc\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.753109 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.753153 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnz8l\" (UniqueName: \"kubernetes.io/projected/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-kube-api-access-bnz8l\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.753213 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-config\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.753282 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.753323 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.754445 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-svc\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.754764 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-config\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.754771 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-nb\") 
pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.754949 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.755402 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.772698 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnz8l\" (UniqueName: \"kubernetes.io/projected/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-kube-api-access-bnz8l\") pod \"dnsmasq-dns-7677694455-7s2lc\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:09 crc kubenswrapper[5047]: I0223 07:10:09.817135 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:10 crc kubenswrapper[5047]: I0223 07:10:10.358498 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-7s2lc"] Feb 23 07:10:11 crc kubenswrapper[5047]: I0223 07:10:11.020608 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 07:10:11 crc kubenswrapper[5047]: I0223 07:10:11.026045 5047 generic.go:334] "Generic (PLEG): container finished" podID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerID="df677e26a5cb88dc86b06af292209bbf0272870d4b53c39276d9422ebee88afe" exitCode=0 Feb 23 07:10:11 crc kubenswrapper[5047]: I0223 07:10:11.026150 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-7s2lc" event={"ID":"68efda44-7cf0-44c4-bc50-df0b73ed5b8b","Type":"ContainerDied","Data":"df677e26a5cb88dc86b06af292209bbf0272870d4b53c39276d9422ebee88afe"} Feb 23 07:10:11 crc kubenswrapper[5047]: I0223 07:10:11.026215 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-7s2lc" event={"ID":"68efda44-7cf0-44c4-bc50-df0b73ed5b8b","Type":"ContainerStarted","Data":"233c74d5d28cbde9dcb36c1578fe5f1cc611f193fe0b34e65d313162444f9a8c"} Feb 23 07:10:11 crc kubenswrapper[5047]: I0223 07:10:11.809711 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:11 crc kubenswrapper[5047]: I0223 07:10:11.903196 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.038708 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-7s2lc" event={"ID":"68efda44-7cf0-44c4-bc50-df0b73ed5b8b","Type":"ContainerStarted","Data":"7fb546caaa8981682e3f4a1e627c0bc5a97ab373a74aa0917242279733ff16a3"} Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.038968 5047 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-central-agent" containerID="cri-o://c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579" gracePeriod=30 Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.039074 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="proxy-httpd" containerID="cri-o://ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b" gracePeriod=30 Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.039152 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-notification-agent" containerID="cri-o://06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726" gracePeriod=30 Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.039267 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="sg-core" containerID="cri-o://9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb" gracePeriod=30 Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.042550 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-log" containerID="cri-o://8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697" gracePeriod=30 Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.042621 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-api" containerID="cri-o://12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b" gracePeriod=30 Feb 23 07:10:12 crc 
kubenswrapper[5047]: I0223 07:10:12.069477 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7677694455-7s2lc" podStartSLOduration=3.069454105 podStartE2EDuration="3.069454105s" podCreationTimestamp="2026-02-23 07:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:12.063495688 +0000 UTC m=+1534.314822832" watchObservedRunningTime="2026-02-23 07:10:12.069454105 +0000 UTC m=+1534.320781249" Feb 23 07:10:12 crc kubenswrapper[5047]: I0223 07:10:12.713364 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.053673 5047 generic.go:334] "Generic (PLEG): container finished" podID="84ab0013-7eda-4e2e-8244-242e7e896603" containerID="8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697" exitCode=143 Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.053751 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84ab0013-7eda-4e2e-8244-242e7e896603","Type":"ContainerDied","Data":"8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697"} Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.062241 5047 generic.go:334] "Generic (PLEG): container finished" podID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerID="ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b" exitCode=0 Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.062279 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerDied","Data":"ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b"} Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.062397 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerDied","Data":"9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb"} Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.062321 5047 generic.go:334] "Generic (PLEG): container finished" podID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerID="9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb" exitCode=2 Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.062433 5047 generic.go:334] "Generic (PLEG): container finished" podID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerID="c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579" exitCode=0 Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.062486 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerDied","Data":"c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579"} Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.062879 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.375804 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fp7fw"] Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.378633 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.391935 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp7fw"] Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.538799 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-utilities\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.539311 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-catalog-content\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.539508 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8r6h\" (UniqueName: \"kubernetes.io/projected/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-kube-api-access-t8r6h\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.641033 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8r6h\" (UniqueName: \"kubernetes.io/projected/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-kube-api-access-t8r6h\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.641091 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-utilities\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.641127 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-catalog-content\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.641629 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-catalog-content\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.642175 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-utilities\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.682055 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8r6h\" (UniqueName: \"kubernetes.io/projected/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-kube-api-access-t8r6h\") pod \"redhat-operators-fp7fw\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:13 crc kubenswrapper[5047]: I0223 07:10:13.701233 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.237604 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp7fw"] Feb 23 07:10:14 crc kubenswrapper[5047]: W0223 07:10:14.248532 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc22dfb_ad4d_4ae5_95f8_e55057a57878.slice/crio-6cea7d333b14564f1ca636f29e95221a532028f3eaf3d3406b107e02d96f208c WatchSource:0}: Error finding container 6cea7d333b14564f1ca636f29e95221a532028f3eaf3d3406b107e02d96f208c: Status 404 returned error can't find the container with id 6cea7d333b14564f1ca636f29e95221a532028f3eaf3d3406b107e02d96f208c Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.863736 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966316 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-sg-core-conf-yaml\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966451 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-scripts\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966510 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-log-httpd\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " 
Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966575 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-ceilometer-tls-certs\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966641 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j988r\" (UniqueName: \"kubernetes.io/projected/55cf0e5d-62dd-4b49-8e84-cd9307857a82-kube-api-access-j988r\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966687 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-config-data\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966730 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-combined-ca-bundle\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.966755 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-run-httpd\") pod \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\" (UID: \"55cf0e5d-62dd-4b49-8e84-cd9307857a82\") " Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.967801 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.977330 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:14 crc kubenswrapper[5047]: I0223 07:10:14.999075 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-scripts" (OuterVolumeSpecName: "scripts") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.009583 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.021920 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cf0e5d-62dd-4b49-8e84-cd9307857a82-kube-api-access-j988r" (OuterVolumeSpecName: "kube-api-access-j988r") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "kube-api-access-j988r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.065476 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.069480 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.069514 5047 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.069528 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j988r\" (UniqueName: \"kubernetes.io/projected/55cf0e5d-62dd-4b49-8e84-cd9307857a82-kube-api-access-j988r\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.069560 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55cf0e5d-62dd-4b49-8e84-cd9307857a82-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.069570 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.069579 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.101320 5047 generic.go:334] "Generic (PLEG): container finished" podID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerID="3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c" exitCode=0 Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.101432 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp7fw" event={"ID":"7cc22dfb-ad4d-4ae5-95f8-e55057a57878","Type":"ContainerDied","Data":"3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c"} Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.101497 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp7fw" event={"ID":"7cc22dfb-ad4d-4ae5-95f8-e55057a57878","Type":"ContainerStarted","Data":"6cea7d333b14564f1ca636f29e95221a532028f3eaf3d3406b107e02d96f208c"} Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.110184 5047 generic.go:334] "Generic (PLEG): container finished" podID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerID="06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726" exitCode=0 Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.110235 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerDied","Data":"06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726"} Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.110269 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55cf0e5d-62dd-4b49-8e84-cd9307857a82","Type":"ContainerDied","Data":"b76e87f8ca9945022f71387af0336143c537b3566cb6618c59f723e1377248e1"} Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.110290 5047 scope.go:117] "RemoveContainer" 
containerID="ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.110433 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.134142 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.160628 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-config-data" (OuterVolumeSpecName: "config-data") pod "55cf0e5d-62dd-4b49-8e84-cd9307857a82" (UID: "55cf0e5d-62dd-4b49-8e84-cd9307857a82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.172103 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.172141 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cf0e5d-62dd-4b49-8e84-cd9307857a82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.174478 5047 scope.go:117] "RemoveContainer" containerID="9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.208030 5047 scope.go:117] "RemoveContainer" containerID="06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.232697 5047 scope.go:117] "RemoveContainer" containerID="c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.295790 5047 scope.go:117] "RemoveContainer" containerID="ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b" Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.297156 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b\": container with ID starting with ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b not found: ID does not exist" containerID="ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.297203 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b"} 
err="failed to get container status \"ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b\": rpc error: code = NotFound desc = could not find container \"ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b\": container with ID starting with ecea963d0e2a0cfc6412ec419ed6f100ec03ca22655bd8dcad6f98f759cafc5b not found: ID does not exist" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.297235 5047 scope.go:117] "RemoveContainer" containerID="9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb" Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.297689 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb\": container with ID starting with 9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb not found: ID does not exist" containerID="9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.297719 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb"} err="failed to get container status \"9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb\": rpc error: code = NotFound desc = could not find container \"9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb\": container with ID starting with 9a2d8665e47f507ac5a9bd1ef70f5ad09192769b8c98e9ae19d7f37912b53efb not found: ID does not exist" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.297742 5047 scope.go:117] "RemoveContainer" containerID="06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726" Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.298230 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726\": container with ID starting with 06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726 not found: ID does not exist" containerID="06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.298255 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726"} err="failed to get container status \"06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726\": rpc error: code = NotFound desc = could not find container \"06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726\": container with ID starting with 06bcc7b23eb3d84ce4ef716d4b2f1931a9c641d0bc275868a75f5371bfe28726 not found: ID does not exist" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.298286 5047 scope.go:117] "RemoveContainer" containerID="c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579" Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.298553 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579\": container with ID starting with c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579 not found: ID does not exist" containerID="c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.298600 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579"} err="failed to get container status \"c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579\": rpc error: code = NotFound desc = could not find container \"c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579\": container with ID 
starting with c7a63fde00908a7029a5a0cc9c9ad4e38bdab61cd7948b87df4667bec68c9579 not found: ID does not exist" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.468058 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.509997 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.537428 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.537877 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="proxy-httpd" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.537890 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="proxy-httpd" Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.537916 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-notification-agent" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.537924 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-notification-agent" Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.537934 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="sg-core" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.537940 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="sg-core" Feb 23 07:10:15 crc kubenswrapper[5047]: E0223 07:10:15.537972 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-central-agent" Feb 23 07:10:15 crc kubenswrapper[5047]: 
I0223 07:10:15.537978 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-central-agent" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.538184 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-notification-agent" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.538196 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="proxy-httpd" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.538209 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="sg-core" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.538227 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" containerName="ceilometer-central-agent" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.540211 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.545414 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.545881 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.546808 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.553596 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.605095 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.689079 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-combined-ca-bundle\") pod \"84ab0013-7eda-4e2e-8244-242e7e896603\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.689449 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5c99\" (UniqueName: \"kubernetes.io/projected/84ab0013-7eda-4e2e-8244-242e7e896603-kube-api-access-w5c99\") pod \"84ab0013-7eda-4e2e-8244-242e7e896603\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.689483 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ab0013-7eda-4e2e-8244-242e7e896603-logs\") pod \"84ab0013-7eda-4e2e-8244-242e7e896603\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.689606 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-config-data\") pod \"84ab0013-7eda-4e2e-8244-242e7e896603\" (UID: \"84ab0013-7eda-4e2e-8244-242e7e896603\") " Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.689973 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-log-httpd\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.690002 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-run-httpd\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.690021 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.690057 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.690092 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.690112 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-scripts\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.690133 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-config-data\") pod \"ceilometer-0\" 
(UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.690153 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2d5z\" (UniqueName: \"kubernetes.io/projected/e307d3a1-af99-460b-bdd0-24e26de38751-kube-api-access-z2d5z\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.692353 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ab0013-7eda-4e2e-8244-242e7e896603-logs" (OuterVolumeSpecName: "logs") pod "84ab0013-7eda-4e2e-8244-242e7e896603" (UID: "84ab0013-7eda-4e2e-8244-242e7e896603"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.697407 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ab0013-7eda-4e2e-8244-242e7e896603-kube-api-access-w5c99" (OuterVolumeSpecName: "kube-api-access-w5c99") pod "84ab0013-7eda-4e2e-8244-242e7e896603" (UID: "84ab0013-7eda-4e2e-8244-242e7e896603"). InnerVolumeSpecName "kube-api-access-w5c99". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.729456 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84ab0013-7eda-4e2e-8244-242e7e896603" (UID: "84ab0013-7eda-4e2e-8244-242e7e896603"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.732193 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-config-data" (OuterVolumeSpecName: "config-data") pod "84ab0013-7eda-4e2e-8244-242e7e896603" (UID: "84ab0013-7eda-4e2e-8244-242e7e896603"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.800097 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-log-httpd\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.800392 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-run-httpd\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.800470 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.800549 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.800816 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-log-httpd\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801048 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-run-httpd\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801490 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801523 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-scripts\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801558 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-config-data\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801585 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2d5z\" (UniqueName: \"kubernetes.io/projected/e307d3a1-af99-460b-bdd0-24e26de38751-kube-api-access-z2d5z\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " 
pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801766 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801785 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5c99\" (UniqueName: \"kubernetes.io/projected/84ab0013-7eda-4e2e-8244-242e7e896603-kube-api-access-w5c99\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801797 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ab0013-7eda-4e2e-8244-242e7e896603-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.801808 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ab0013-7eda-4e2e-8244-242e7e896603-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.805432 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.805626 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.807679 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-scripts\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.808094 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.808182 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-config-data\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.822138 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2d5z\" (UniqueName: \"kubernetes.io/projected/e307d3a1-af99-460b-bdd0-24e26de38751-kube-api-access-z2d5z\") pod \"ceilometer-0\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " pod="openstack/ceilometer-0" Feb 23 07:10:15 crc kubenswrapper[5047]: I0223 07:10:15.858367 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.124032 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp7fw" event={"ID":"7cc22dfb-ad4d-4ae5-95f8-e55057a57878","Type":"ContainerStarted","Data":"ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93"} Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.130566 5047 generic.go:334] "Generic (PLEG): container finished" podID="84ab0013-7eda-4e2e-8244-242e7e896603" containerID="12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b" exitCode=0 Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.130596 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.130636 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84ab0013-7eda-4e2e-8244-242e7e896603","Type":"ContainerDied","Data":"12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b"} Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.130678 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84ab0013-7eda-4e2e-8244-242e7e896603","Type":"ContainerDied","Data":"68d4909b24395592e18ce12926fd11571d55d554e18a7eb37e140479049773cb"} Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.130736 5047 scope.go:117] "RemoveContainer" containerID="12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.165019 5047 scope.go:117] "RemoveContainer" containerID="8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.178106 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.193548 5047 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.196751 5047 scope.go:117] "RemoveContainer" containerID="12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b" Feb 23 07:10:16 crc kubenswrapper[5047]: E0223 07:10:16.201237 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b\": container with ID starting with 12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b not found: ID does not exist" containerID="12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.201285 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b"} err="failed to get container status \"12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b\": rpc error: code = NotFound desc = could not find container \"12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b\": container with ID starting with 12acc951ea1c23b29390ddca608807d477006e621c027794656239f91b5fde2b not found: ID does not exist" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.201314 5047 scope.go:117] "RemoveContainer" containerID="8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697" Feb 23 07:10:16 crc kubenswrapper[5047]: E0223 07:10:16.201792 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697\": container with ID starting with 8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697 not found: ID does not exist" containerID="8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.201818 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697"} err="failed to get container status \"8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697\": rpc error: code = NotFound desc = could not find container \"8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697\": container with ID starting with 8c441540f0378b86998d22edba1bb329e5fcfcc9bf3122d9794ebc1137c8a697 not found: ID does not exist" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.209473 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:16 crc kubenswrapper[5047]: E0223 07:10:16.210944 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-log" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.210969 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-log" Feb 23 07:10:16 crc kubenswrapper[5047]: E0223 07:10:16.210994 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-api" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.211001 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-api" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.211224 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-api" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.211259 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" containerName="nova-api-log" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.212305 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.215281 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.215485 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.215774 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.239159 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.312018 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvxx\" (UniqueName: \"kubernetes.io/projected/4815efeb-328e-40fb-b420-c62b2fe7e984-kube-api-access-5lvxx\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.312083 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-config-data\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.312143 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.312209 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4815efeb-328e-40fb-b420-c62b2fe7e984-logs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.312257 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.312317 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-public-tls-certs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.354041 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cf0e5d-62dd-4b49-8e84-cd9307857a82" path="/var/lib/kubelet/pods/55cf0e5d-62dd-4b49-8e84-cd9307857a82/volumes" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.355027 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ab0013-7eda-4e2e-8244-242e7e896603" path="/var/lib/kubelet/pods/84ab0013-7eda-4e2e-8244-242e7e896603/volumes" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.361666 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:16 crc kubenswrapper[5047]: W0223 07:10:16.371932 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode307d3a1_af99_460b_bdd0_24e26de38751.slice/crio-db94c1d793dd31119a747270bdd5f33a4fe8a08855386251bf346d8de1893013 WatchSource:0}: Error finding container db94c1d793dd31119a747270bdd5f33a4fe8a08855386251bf346d8de1893013: 
Status 404 returned error can't find the container with id db94c1d793dd31119a747270bdd5f33a4fe8a08855386251bf346d8de1893013 Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.414267 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4815efeb-328e-40fb-b420-c62b2fe7e984-logs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.414364 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.414429 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-public-tls-certs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.414572 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvxx\" (UniqueName: \"kubernetes.io/projected/4815efeb-328e-40fb-b420-c62b2fe7e984-kube-api-access-5lvxx\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.414608 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-config-data\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.414649 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.414722 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4815efeb-328e-40fb-b420-c62b2fe7e984-logs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.424886 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-config-data\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.424978 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.425480 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-public-tls-certs\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.426591 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc 
kubenswrapper[5047]: I0223 07:10:16.431730 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvxx\" (UniqueName: \"kubernetes.io/projected/4815efeb-328e-40fb-b420-c62b2fe7e984-kube-api-access-5lvxx\") pod \"nova-api-0\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.538579 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.760602 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.760892 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.760956 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.761857 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"837f7c0cf58cb1629ad928ad357807b43700d666b61b873505ed015b092129de"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:10:16 crc kubenswrapper[5047]: I0223 07:10:16.761945 5047 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://837f7c0cf58cb1629ad928ad357807b43700d666b61b873505ed015b092129de" gracePeriod=600 Feb 23 07:10:16 crc kubenswrapper[5047]: E0223 07:10:16.938916 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc22dfb_ad4d_4ae5_95f8_e55057a57878.slice/crio-ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.021688 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.147576 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4815efeb-328e-40fb-b420-c62b2fe7e984","Type":"ContainerStarted","Data":"8f17261f176f6d5d424dbfb614d2a74ba699861f4180f37782e6e9a0484cd2bb"} Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.149680 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerStarted","Data":"db94c1d793dd31119a747270bdd5f33a4fe8a08855386251bf346d8de1893013"} Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.153673 5047 generic.go:334] "Generic (PLEG): container finished" podID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerID="ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93" exitCode=0 Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.153821 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp7fw" 
event={"ID":"7cc22dfb-ad4d-4ae5-95f8-e55057a57878","Type":"ContainerDied","Data":"ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93"} Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.158645 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="837f7c0cf58cb1629ad928ad357807b43700d666b61b873505ed015b092129de" exitCode=0 Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.158698 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"837f7c0cf58cb1629ad928ad357807b43700d666b61b873505ed015b092129de"} Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.158744 5047 scope.go:117] "RemoveContainer" containerID="18683d8e116e648be702e8fe2b59331eb73a682d795d009e915974255f56b210" Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.714405 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:10:17 crc kubenswrapper[5047]: I0223 07:10:17.741125 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.173556 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp7fw" event={"ID":"7cc22dfb-ad4d-4ae5-95f8-e55057a57878","Type":"ContainerStarted","Data":"cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138"} Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.176677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"} Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.179165 
5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4815efeb-328e-40fb-b420-c62b2fe7e984","Type":"ContainerStarted","Data":"d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557"} Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.179202 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4815efeb-328e-40fb-b420-c62b2fe7e984","Type":"ContainerStarted","Data":"739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545"} Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.181752 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerStarted","Data":"ac42b72f2ef3e70086002c27d0507227c425dcef23dd72ee4b41f90f506d1f7e"} Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.203190 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fp7fw" podStartSLOduration=2.7553927590000002 podStartE2EDuration="5.203163958s" podCreationTimestamp="2026-02-23 07:10:13 +0000 UTC" firstStartedPulling="2026-02-23 07:10:15.106805535 +0000 UTC m=+1537.358132669" lastFinishedPulling="2026-02-23 07:10:17.554576734 +0000 UTC m=+1539.805903868" observedRunningTime="2026-02-23 07:10:18.194189232 +0000 UTC m=+1540.445516376" watchObservedRunningTime="2026-02-23 07:10:18.203163958 +0000 UTC m=+1540.454491112" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.211637 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.263466 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.263426546 podStartE2EDuration="2.263426546s" podCreationTimestamp="2026-02-23 07:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:18.251694717 +0000 UTC m=+1540.503021871" watchObservedRunningTime="2026-02-23 07:10:18.263426546 +0000 UTC m=+1540.514753720" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.544692 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fzmn4"] Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.558786 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.564939 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.566213 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.573330 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzmn4"] Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.689560 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-scripts\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.689654 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.690053 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j77p\" (UniqueName: \"kubernetes.io/projected/1b9b1413-5859-4568-b3f8-d9401fe4b34e-kube-api-access-9j77p\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.690090 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-config-data\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.792413 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-config-data\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.792496 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-scripts\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.792546 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.792674 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9j77p\" (UniqueName: \"kubernetes.io/projected/1b9b1413-5859-4568-b3f8-d9401fe4b34e-kube-api-access-9j77p\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.799521 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-scripts\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.803880 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.817550 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-config-data\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.834314 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j77p\" (UniqueName: \"kubernetes.io/projected/1b9b1413-5859-4568-b3f8-d9401fe4b34e-kube-api-access-9j77p\") pod \"nova-cell1-cell-mapping-fzmn4\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:18 crc kubenswrapper[5047]: I0223 07:10:18.883937 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:19 crc kubenswrapper[5047]: I0223 07:10:19.223309 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerStarted","Data":"de1bae2fc8abe4518e053551089b717523cf45b88661909a16503dc7aee58baf"} Feb 23 07:10:19 crc kubenswrapper[5047]: I0223 07:10:19.224556 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerStarted","Data":"235052c048504a3689eeba925718b5995ae8bb50fcdd7c36cd0a37a3cac313c6"} Feb 23 07:10:19 crc kubenswrapper[5047]: I0223 07:10:19.496601 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzmn4"] Feb 23 07:10:19 crc kubenswrapper[5047]: I0223 07:10:19.818919 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:10:19 crc kubenswrapper[5047]: I0223 07:10:19.913389 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-7kfxn"] Feb 23 07:10:19 crc kubenswrapper[5047]: I0223 07:10:19.913760 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerName="dnsmasq-dns" containerID="cri-o://3f73a342fb9259de5b85d35476814c1b8fb2a82ad57a4291c4eba08237571620" gracePeriod=10 Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.242076 5047 generic.go:334] "Generic (PLEG): container finished" podID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerID="3f73a342fb9259de5b85d35476814c1b8fb2a82ad57a4291c4eba08237571620" exitCode=0 Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.243743 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" 
event={"ID":"d48e05a8-af41-43a5-bc35-19b17bf735e3","Type":"ContainerDied","Data":"3f73a342fb9259de5b85d35476814c1b8fb2a82ad57a4291c4eba08237571620"} Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.262015 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzmn4" event={"ID":"1b9b1413-5859-4568-b3f8-d9401fe4b34e","Type":"ContainerStarted","Data":"16b656be4c9cc38ec648ec0e389c1956835cc8867344cd5dc329014321e6c9af"} Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.262078 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzmn4" event={"ID":"1b9b1413-5859-4568-b3f8-d9401fe4b34e","Type":"ContainerStarted","Data":"2ec018612c7981f82cf69173b42d04acd7f45606e5ce878224066b70bccb22fc"} Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.283065 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fzmn4" podStartSLOduration=2.283037633 podStartE2EDuration="2.283037633s" podCreationTimestamp="2026-02-23 07:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:20.280979159 +0000 UTC m=+1542.532306293" watchObservedRunningTime="2026-02-23 07:10:20.283037633 +0000 UTC m=+1542.534364767" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.512949 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.675939 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-config\") pod \"d48e05a8-af41-43a5-bc35-19b17bf735e3\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.676047 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-svc\") pod \"d48e05a8-af41-43a5-bc35-19b17bf735e3\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.676086 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8whx\" (UniqueName: \"kubernetes.io/projected/d48e05a8-af41-43a5-bc35-19b17bf735e3-kube-api-access-q8whx\") pod \"d48e05a8-af41-43a5-bc35-19b17bf735e3\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.676120 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-swift-storage-0\") pod \"d48e05a8-af41-43a5-bc35-19b17bf735e3\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.676265 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-nb\") pod \"d48e05a8-af41-43a5-bc35-19b17bf735e3\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.676337 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-sb\") pod \"d48e05a8-af41-43a5-bc35-19b17bf735e3\" (UID: \"d48e05a8-af41-43a5-bc35-19b17bf735e3\") " Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.683507 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48e05a8-af41-43a5-bc35-19b17bf735e3-kube-api-access-q8whx" (OuterVolumeSpecName: "kube-api-access-q8whx") pod "d48e05a8-af41-43a5-bc35-19b17bf735e3" (UID: "d48e05a8-af41-43a5-bc35-19b17bf735e3"). InnerVolumeSpecName "kube-api-access-q8whx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.757698 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d48e05a8-af41-43a5-bc35-19b17bf735e3" (UID: "d48e05a8-af41-43a5-bc35-19b17bf735e3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.764452 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d48e05a8-af41-43a5-bc35-19b17bf735e3" (UID: "d48e05a8-af41-43a5-bc35-19b17bf735e3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.781154 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8whx\" (UniqueName: \"kubernetes.io/projected/d48e05a8-af41-43a5-bc35-19b17bf735e3-kube-api-access-q8whx\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.781191 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.781200 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.781887 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d48e05a8-af41-43a5-bc35-19b17bf735e3" (UID: "d48e05a8-af41-43a5-bc35-19b17bf735e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.795538 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-config" (OuterVolumeSpecName: "config") pod "d48e05a8-af41-43a5-bc35-19b17bf735e3" (UID: "d48e05a8-af41-43a5-bc35-19b17bf735e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.802461 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d48e05a8-af41-43a5-bc35-19b17bf735e3" (UID: "d48e05a8-af41-43a5-bc35-19b17bf735e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.882961 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.883422 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:20 crc kubenswrapper[5047]: I0223 07:10:20.883432 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48e05a8-af41-43a5-bc35-19b17bf735e3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.279002 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" event={"ID":"d48e05a8-af41-43a5-bc35-19b17bf735e3","Type":"ContainerDied","Data":"eb3488a2cd70c23a480536e538adeb7f84157834dbcb09415d0243a028e55264"} Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.279073 5047 scope.go:117] "RemoveContainer" containerID="3f73a342fb9259de5b85d35476814c1b8fb2a82ad57a4291c4eba08237571620" Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.279240 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.296023 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerStarted","Data":"1843426c8ae30cc809f15219c4a71a415a94a52a2bc0a9a35f7e3c5e3ca3cd62"} Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.296105 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.306391 5047 scope.go:117] "RemoveContainer" containerID="6bb60eeb023dade211e23276a0c4b1a2179398333be845aeede6772875cc232c" Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.346363 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.078102177 podStartE2EDuration="6.346333176s" podCreationTimestamp="2026-02-23 07:10:15 +0000 UTC" firstStartedPulling="2026-02-23 07:10:16.374471333 +0000 UTC m=+1538.625798467" lastFinishedPulling="2026-02-23 07:10:20.642702332 +0000 UTC m=+1542.894029466" observedRunningTime="2026-02-23 07:10:21.334761361 +0000 UTC m=+1543.586088505" watchObservedRunningTime="2026-02-23 07:10:21.346333176 +0000 UTC m=+1543.597660310" Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.358965 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-7kfxn"] Feb 23 07:10:21 crc kubenswrapper[5047]: I0223 07:10:21.368698 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-7kfxn"] Feb 23 07:10:22 crc kubenswrapper[5047]: I0223 07:10:22.352290 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" path="/var/lib/kubelet/pods/d48e05a8-af41-43a5-bc35-19b17bf735e3/volumes" Feb 23 07:10:23 crc kubenswrapper[5047]: I0223 07:10:23.701482 5047 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:23 crc kubenswrapper[5047]: I0223 07:10:23.702503 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:24 crc kubenswrapper[5047]: I0223 07:10:24.764151 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fp7fw" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="registry-server" probeResult="failure" output=< Feb 23 07:10:24 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 07:10:24 crc kubenswrapper[5047]: > Feb 23 07:10:25 crc kubenswrapper[5047]: I0223 07:10:25.337741 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75ddbf7c75-7kfxn" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: i/o timeout" Feb 23 07:10:25 crc kubenswrapper[5047]: I0223 07:10:25.361151 5047 generic.go:334] "Generic (PLEG): container finished" podID="1b9b1413-5859-4568-b3f8-d9401fe4b34e" containerID="16b656be4c9cc38ec648ec0e389c1956835cc8867344cd5dc329014321e6c9af" exitCode=0 Feb 23 07:10:25 crc kubenswrapper[5047]: I0223 07:10:25.361225 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzmn4" event={"ID":"1b9b1413-5859-4568-b3f8-d9401fe4b34e","Type":"ContainerDied","Data":"16b656be4c9cc38ec648ec0e389c1956835cc8867344cd5dc329014321e6c9af"} Feb 23 07:10:26 crc kubenswrapper[5047]: I0223 07:10:26.540081 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:10:26 crc kubenswrapper[5047]: I0223 07:10:26.540721 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:10:26 crc kubenswrapper[5047]: I0223 07:10:26.848691 5047 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.049776 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-scripts\") pod \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.049991 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-combined-ca-bundle\") pod \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.050045 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-config-data\") pod \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.050106 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j77p\" (UniqueName: \"kubernetes.io/projected/1b9b1413-5859-4568-b3f8-d9401fe4b34e-kube-api-access-9j77p\") pod \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\" (UID: \"1b9b1413-5859-4568-b3f8-d9401fe4b34e\") " Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.058254 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-scripts" (OuterVolumeSpecName: "scripts") pod "1b9b1413-5859-4568-b3f8-d9401fe4b34e" (UID: "1b9b1413-5859-4568-b3f8-d9401fe4b34e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.068513 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9b1413-5859-4568-b3f8-d9401fe4b34e-kube-api-access-9j77p" (OuterVolumeSpecName: "kube-api-access-9j77p") pod "1b9b1413-5859-4568-b3f8-d9401fe4b34e" (UID: "1b9b1413-5859-4568-b3f8-d9401fe4b34e"). InnerVolumeSpecName "kube-api-access-9j77p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.083140 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b9b1413-5859-4568-b3f8-d9401fe4b34e" (UID: "1b9b1413-5859-4568-b3f8-d9401fe4b34e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.084663 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-config-data" (OuterVolumeSpecName: "config-data") pod "1b9b1413-5859-4568-b3f8-d9401fe4b34e" (UID: "1b9b1413-5859-4568-b3f8-d9401fe4b34e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.153267 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.153304 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.153316 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b9b1413-5859-4568-b3f8-d9401fe4b34e-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.153329 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j77p\" (UniqueName: \"kubernetes.io/projected/1b9b1413-5859-4568-b3f8-d9401fe4b34e-kube-api-access-9j77p\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.393986 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzmn4" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.401675 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzmn4" event={"ID":"1b9b1413-5859-4568-b3f8-d9401fe4b34e","Type":"ContainerDied","Data":"2ec018612c7981f82cf69173b42d04acd7f45606e5ce878224066b70bccb22fc"} Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.401777 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ec018612c7981f82cf69173b42d04acd7f45606e5ce878224066b70bccb22fc" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.562692 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.562725 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.596830 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.607804 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.608604 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a09d8fa2-5b55-4c94-8c50-fe34bb06924e" containerName="nova-scheduler-scheduler" containerID="cri-o://31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b" gracePeriod=30 Feb 23 
07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.645420 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.645856 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-log" containerID="cri-o://32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364" gracePeriod=30 Feb 23 07:10:27 crc kubenswrapper[5047]: I0223 07:10:27.646410 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-metadata" containerID="cri-o://84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2" gracePeriod=30 Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.408426 5047 generic.go:334] "Generic (PLEG): container finished" podID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerID="32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364" exitCode=143 Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.408491 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc9b19ab-9f8a-4127-944e-116e2b843310","Type":"ContainerDied","Data":"32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364"} Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.409074 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-log" containerID="cri-o://739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545" gracePeriod=30 Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.409252 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-api" 
containerID="cri-o://d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557" gracePeriod=30 Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.818805 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.919778 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-combined-ca-bundle\") pod \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.919985 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2kpv\" (UniqueName: \"kubernetes.io/projected/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-kube-api-access-r2kpv\") pod \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.920055 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-config-data\") pod \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\" (UID: \"a09d8fa2-5b55-4c94-8c50-fe34bb06924e\") " Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.932152 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-kube-api-access-r2kpv" (OuterVolumeSpecName: "kube-api-access-r2kpv") pod "a09d8fa2-5b55-4c94-8c50-fe34bb06924e" (UID: "a09d8fa2-5b55-4c94-8c50-fe34bb06924e"). InnerVolumeSpecName "kube-api-access-r2kpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.956597 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a09d8fa2-5b55-4c94-8c50-fe34bb06924e" (UID: "a09d8fa2-5b55-4c94-8c50-fe34bb06924e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:28 crc kubenswrapper[5047]: I0223 07:10:28.959000 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-config-data" (OuterVolumeSpecName: "config-data") pod "a09d8fa2-5b55-4c94-8c50-fe34bb06924e" (UID: "a09d8fa2-5b55-4c94-8c50-fe34bb06924e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.022655 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.022735 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2kpv\" (UniqueName: \"kubernetes.io/projected/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-kube-api-access-r2kpv\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.022751 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09d8fa2-5b55-4c94-8c50-fe34bb06924e-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.423503 5047 generic.go:334] "Generic (PLEG): container finished" podID="a09d8fa2-5b55-4c94-8c50-fe34bb06924e" containerID="31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b" 
exitCode=0 Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.423557 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.423556 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a09d8fa2-5b55-4c94-8c50-fe34bb06924e","Type":"ContainerDied","Data":"31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b"} Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.423679 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a09d8fa2-5b55-4c94-8c50-fe34bb06924e","Type":"ContainerDied","Data":"d669cc20e7935aa938555ad8bebade0d1007a5c5a2b381433f9577226cdaa552"} Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.423705 5047 scope.go:117] "RemoveContainer" containerID="31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.425830 5047 generic.go:334] "Generic (PLEG): container finished" podID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerID="739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545" exitCode=143 Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.425887 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4815efeb-328e-40fb-b420-c62b2fe7e984","Type":"ContainerDied","Data":"739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545"} Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.456192 5047 scope.go:117] "RemoveContainer" containerID="31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b" Feb 23 07:10:29 crc kubenswrapper[5047]: E0223 07:10:29.456542 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b\": container with ID starting with 
31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b not found: ID does not exist" containerID="31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.456696 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b"} err="failed to get container status \"31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b\": rpc error: code = NotFound desc = could not find container \"31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b\": container with ID starting with 31343670cf5cde0bd0303067cd57e0a57c44bf4242ce1b88c23ed27786ca708b not found: ID does not exist" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.478099 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.500514 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.523652 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:29 crc kubenswrapper[5047]: E0223 07:10:29.524221 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerName="init" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.524242 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerName="init" Feb 23 07:10:29 crc kubenswrapper[5047]: E0223 07:10:29.524259 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerName="dnsmasq-dns" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.524267 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerName="dnsmasq-dns" Feb 23 07:10:29 
crc kubenswrapper[5047]: E0223 07:10:29.524299 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09d8fa2-5b55-4c94-8c50-fe34bb06924e" containerName="nova-scheduler-scheduler" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.524305 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d8fa2-5b55-4c94-8c50-fe34bb06924e" containerName="nova-scheduler-scheduler" Feb 23 07:10:29 crc kubenswrapper[5047]: E0223 07:10:29.524314 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9b1413-5859-4568-b3f8-d9401fe4b34e" containerName="nova-manage" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.524320 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9b1413-5859-4568-b3f8-d9401fe4b34e" containerName="nova-manage" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.524492 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48e05a8-af41-43a5-bc35-19b17bf735e3" containerName="dnsmasq-dns" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.524504 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9b1413-5859-4568-b3f8-d9401fe4b34e" containerName="nova-manage" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.524516 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09d8fa2-5b55-4c94-8c50-fe34bb06924e" containerName="nova-scheduler-scheduler" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.525255 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.527610 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.546000 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.636760 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-config-data\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.636810 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.636924 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npt6r\" (UniqueName: \"kubernetes.io/projected/5c628150-3bde-4740-9c94-dc208f61ade2-kube-api-access-npt6r\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.739681 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-config-data\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.740156 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.740226 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npt6r\" (UniqueName: \"kubernetes.io/projected/5c628150-3bde-4740-9c94-dc208f61ade2-kube-api-access-npt6r\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.743868 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-config-data\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.744213 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.762607 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npt6r\" (UniqueName: \"kubernetes.io/projected/5c628150-3bde-4740-9c94-dc208f61ade2-kube-api-access-npt6r\") pod \"nova-scheduler-0\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " pod="openstack/nova-scheduler-0" Feb 23 07:10:29 crc kubenswrapper[5047]: I0223 07:10:29.853102 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:10:30 crc kubenswrapper[5047]: I0223 07:10:30.365783 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09d8fa2-5b55-4c94-8c50-fe34bb06924e" path="/var/lib/kubelet/pods/a09d8fa2-5b55-4c94-8c50-fe34bb06924e/volumes" Feb 23 07:10:30 crc kubenswrapper[5047]: I0223 07:10:30.379458 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:30 crc kubenswrapper[5047]: W0223 07:10:30.387671 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c628150_3bde_4740_9c94_dc208f61ade2.slice/crio-3da926f76ee4233efa2ff012807dcadbff80d3b70dd92f81f859289ec06ac986 WatchSource:0}: Error finding container 3da926f76ee4233efa2ff012807dcadbff80d3b70dd92f81f859289ec06ac986: Status 404 returned error can't find the container with id 3da926f76ee4233efa2ff012807dcadbff80d3b70dd92f81f859289ec06ac986 Feb 23 07:10:30 crc kubenswrapper[5047]: I0223 07:10:30.440729 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c628150-3bde-4740-9c94-dc208f61ade2","Type":"ContainerStarted","Data":"3da926f76ee4233efa2ff012807dcadbff80d3b70dd92f81f859289ec06ac986"} Feb 23 07:10:30 crc kubenswrapper[5047]: I0223 07:10:30.797901 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:54038->10.217.0.197:8775: read: connection reset by peer" Feb 23 07:10:30 crc kubenswrapper[5047]: I0223 07:10:30.798055 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 
10.217.0.2:54050->10.217.0.197:8775: read: connection reset by peer" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.350555 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.459449 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c628150-3bde-4740-9c94-dc208f61ade2","Type":"ContainerStarted","Data":"673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f"} Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.464141 5047 generic.go:334] "Generic (PLEG): container finished" podID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerID="84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2" exitCode=0 Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.464189 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc9b19ab-9f8a-4127-944e-116e2b843310","Type":"ContainerDied","Data":"84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2"} Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.464214 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc9b19ab-9f8a-4127-944e-116e2b843310","Type":"ContainerDied","Data":"c833ebb18e775fee47c5c108bafc1f9d4d9cd700b14092cb139726e8ff0f50ec"} Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.464237 5047 scope.go:117] "RemoveContainer" containerID="84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.464317 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.485141 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.485117252 podStartE2EDuration="2.485117252s" podCreationTimestamp="2026-02-23 07:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:31.477772829 +0000 UTC m=+1553.729099963" watchObservedRunningTime="2026-02-23 07:10:31.485117252 +0000 UTC m=+1553.736444386" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.494264 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-config-data\") pod \"cc9b19ab-9f8a-4127-944e-116e2b843310\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.494445 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-nova-metadata-tls-certs\") pod \"cc9b19ab-9f8a-4127-944e-116e2b843310\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.494575 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbkr\" (UniqueName: \"kubernetes.io/projected/cc9b19ab-9f8a-4127-944e-116e2b843310-kube-api-access-8cbkr\") pod \"cc9b19ab-9f8a-4127-944e-116e2b843310\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.494604 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-combined-ca-bundle\") pod 
\"cc9b19ab-9f8a-4127-944e-116e2b843310\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.494696 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc9b19ab-9f8a-4127-944e-116e2b843310-logs\") pod \"cc9b19ab-9f8a-4127-944e-116e2b843310\" (UID: \"cc9b19ab-9f8a-4127-944e-116e2b843310\") " Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.495642 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc9b19ab-9f8a-4127-944e-116e2b843310-logs" (OuterVolumeSpecName: "logs") pod "cc9b19ab-9f8a-4127-944e-116e2b843310" (UID: "cc9b19ab-9f8a-4127-944e-116e2b843310"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.510633 5047 scope.go:117] "RemoveContainer" containerID="32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.511035 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9b19ab-9f8a-4127-944e-116e2b843310-kube-api-access-8cbkr" (OuterVolumeSpecName: "kube-api-access-8cbkr") pod "cc9b19ab-9f8a-4127-944e-116e2b843310" (UID: "cc9b19ab-9f8a-4127-944e-116e2b843310"). InnerVolumeSpecName "kube-api-access-8cbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.527778 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-config-data" (OuterVolumeSpecName: "config-data") pod "cc9b19ab-9f8a-4127-944e-116e2b843310" (UID: "cc9b19ab-9f8a-4127-944e-116e2b843310"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.532131 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc9b19ab-9f8a-4127-944e-116e2b843310" (UID: "cc9b19ab-9f8a-4127-944e-116e2b843310"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.550920 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cc9b19ab-9f8a-4127-944e-116e2b843310" (UID: "cc9b19ab-9f8a-4127-944e-116e2b843310"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.597848 5047 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.597920 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbkr\" (UniqueName: \"kubernetes.io/projected/cc9b19ab-9f8a-4127-944e-116e2b843310-kube-api-access-8cbkr\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.597937 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.597949 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cc9b19ab-9f8a-4127-944e-116e2b843310-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.597962 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9b19ab-9f8a-4127-944e-116e2b843310-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.639954 5047 scope.go:117] "RemoveContainer" containerID="84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2" Feb 23 07:10:31 crc kubenswrapper[5047]: E0223 07:10:31.640489 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2\": container with ID starting with 84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2 not found: ID does not exist" containerID="84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.640576 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2"} err="failed to get container status \"84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2\": rpc error: code = NotFound desc = could not find container \"84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2\": container with ID starting with 84b8331202fbfcf70dee794c91c4472ba39ff7a3c996eaf20b2b30fe07e14ad2 not found: ID does not exist" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.640670 5047 scope.go:117] "RemoveContainer" containerID="32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364" Feb 23 07:10:31 crc kubenswrapper[5047]: E0223 07:10:31.641106 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364\": container with ID starting with 32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364 not found: ID does not exist" containerID="32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.641301 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364"} err="failed to get container status \"32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364\": rpc error: code = NotFound desc = could not find container \"32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364\": container with ID starting with 32b939c9054bff897de2c9bc7e06c46485057377ca90daad4c84f6e37485a364 not found: ID does not exist" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.806591 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.814327 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.847858 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:10:31 crc kubenswrapper[5047]: E0223 07:10:31.848542 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-log" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.848609 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-log" Feb 23 07:10:31 crc kubenswrapper[5047]: E0223 07:10:31.848719 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-metadata" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.848773 5047 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-metadata" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.849021 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-metadata" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.849124 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" containerName="nova-metadata-log" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.855673 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.858707 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.858795 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.882617 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.912098 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.912608 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-logs\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:31 crc 
kubenswrapper[5047]: I0223 07:10:31.912740 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-config-data\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.912838 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p25s\" (UniqueName: \"kubernetes.io/projected/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-kube-api-access-2p25s\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:31 crc kubenswrapper[5047]: I0223 07:10:31.912967 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.015128 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.015230 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-logs\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.015264 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-config-data\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.015296 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p25s\" (UniqueName: \"kubernetes.io/projected/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-kube-api-access-2p25s\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.015345 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.015950 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-logs\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.020111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.020723 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " 
pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.021301 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-config-data\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.040335 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p25s\" (UniqueName: \"kubernetes.io/projected/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-kube-api-access-2p25s\") pod \"nova-metadata-0\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.180668 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.354097 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9b19ab-9f8a-4127-944e-116e2b843310" path="/var/lib/kubelet/pods/cc9b19ab-9f8a-4127-944e-116e2b843310/volumes" Feb 23 07:10:32 crc kubenswrapper[5047]: I0223 07:10:32.696154 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:10:32 crc kubenswrapper[5047]: W0223 07:10:32.716143 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa3349ce_92c0_4fc5_ae0e_6424be7ca179.slice/crio-fc330f41b07a0a1f9d7cf14f2b8c1a2fdadd4ed855b8d39328b1ef99c577ad53 WatchSource:0}: Error finding container fc330f41b07a0a1f9d7cf14f2b8c1a2fdadd4ed855b8d39328b1ef99c577ad53: Status 404 returned error can't find the container with id fc330f41b07a0a1f9d7cf14f2b8c1a2fdadd4ed855b8d39328b1ef99c577ad53 Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.408038 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.448518 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-public-tls-certs\") pod \"4815efeb-328e-40fb-b420-c62b2fe7e984\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.448606 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4815efeb-328e-40fb-b420-c62b2fe7e984-logs\") pod \"4815efeb-328e-40fb-b420-c62b2fe7e984\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.448680 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvxx\" (UniqueName: \"kubernetes.io/projected/4815efeb-328e-40fb-b420-c62b2fe7e984-kube-api-access-5lvxx\") pod \"4815efeb-328e-40fb-b420-c62b2fe7e984\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.448723 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-internal-tls-certs\") pod \"4815efeb-328e-40fb-b420-c62b2fe7e984\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.448763 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-config-data\") pod \"4815efeb-328e-40fb-b420-c62b2fe7e984\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.448921 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-combined-ca-bundle\") pod \"4815efeb-328e-40fb-b420-c62b2fe7e984\" (UID: \"4815efeb-328e-40fb-b420-c62b2fe7e984\") " Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.450073 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4815efeb-328e-40fb-b420-c62b2fe7e984-logs" (OuterVolumeSpecName: "logs") pod "4815efeb-328e-40fb-b420-c62b2fe7e984" (UID: "4815efeb-328e-40fb-b420-c62b2fe7e984"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.477214 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4815efeb-328e-40fb-b420-c62b2fe7e984-kube-api-access-5lvxx" (OuterVolumeSpecName: "kube-api-access-5lvxx") pod "4815efeb-328e-40fb-b420-c62b2fe7e984" (UID: "4815efeb-328e-40fb-b420-c62b2fe7e984"). InnerVolumeSpecName "kube-api-access-5lvxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.502459 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-config-data" (OuterVolumeSpecName: "config-data") pod "4815efeb-328e-40fb-b420-c62b2fe7e984" (UID: "4815efeb-328e-40fb-b420-c62b2fe7e984"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.505979 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4815efeb-328e-40fb-b420-c62b2fe7e984" (UID: "4815efeb-328e-40fb-b420-c62b2fe7e984"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.506014 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4815efeb-328e-40fb-b420-c62b2fe7e984","Type":"ContainerDied","Data":"d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557"} Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.505979 5047 generic.go:334] "Generic (PLEG): container finished" podID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerID="d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557" exitCode=0 Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.506057 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4815efeb-328e-40fb-b420-c62b2fe7e984","Type":"ContainerDied","Data":"8f17261f176f6d5d424dbfb614d2a74ba699861f4180f37782e6e9a0484cd2bb"} Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.506082 5047 scope.go:117] "RemoveContainer" containerID="d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.506251 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.513921 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa3349ce-92c0-4fc5-ae0e-6424be7ca179","Type":"ContainerStarted","Data":"57928662f0ede1894731138f7257539c72b6683c8af427121b07e24b7dd35d1f"} Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.513981 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa3349ce-92c0-4fc5-ae0e-6424be7ca179","Type":"ContainerStarted","Data":"0108825e07a1a995d5a32fde6ef858690c1ed2dd364cbab05781f1acab8f3fcb"} Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.513996 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa3349ce-92c0-4fc5-ae0e-6424be7ca179","Type":"ContainerStarted","Data":"fc330f41b07a0a1f9d7cf14f2b8c1a2fdadd4ed855b8d39328b1ef99c577ad53"} Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.537098 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.537081542 podStartE2EDuration="2.537081542s" podCreationTimestamp="2026-02-23 07:10:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:33.532399418 +0000 UTC m=+1555.783726552" watchObservedRunningTime="2026-02-23 07:10:33.537081542 +0000 UTC m=+1555.788408676" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.546309 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4815efeb-328e-40fb-b420-c62b2fe7e984" (UID: "4815efeb-328e-40fb-b420-c62b2fe7e984"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.546350 5047 scope.go:117] "RemoveContainer" containerID="739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.550227 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4815efeb-328e-40fb-b420-c62b2fe7e984" (UID: "4815efeb-328e-40fb-b420-c62b2fe7e984"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.551856 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvxx\" (UniqueName: \"kubernetes.io/projected/4815efeb-328e-40fb-b420-c62b2fe7e984-kube-api-access-5lvxx\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.551882 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.551893 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.551917 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.551926 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4815efeb-328e-40fb-b420-c62b2fe7e984-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.551937 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4815efeb-328e-40fb-b420-c62b2fe7e984-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.564432 5047 scope.go:117] "RemoveContainer" containerID="d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557" Feb 23 07:10:33 crc kubenswrapper[5047]: E0223 07:10:33.565073 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557\": container with ID starting with d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557 not found: ID does not exist" containerID="d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.565280 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557"} err="failed to get container status \"d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557\": rpc error: code = NotFound desc = could not find container \"d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557\": container with ID starting with d90f3bbcab0a45e3eee9fff485dafa6c8e80a9a3523dd5af3ef4d6557e4f1557 not found: ID does not exist" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.565429 5047 scope.go:117] "RemoveContainer" containerID="739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545" Feb 23 07:10:33 crc kubenswrapper[5047]: E0223 07:10:33.566656 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545\": container with ID starting with 739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545 not found: ID does not exist" containerID="739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.566703 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545"} err="failed to get container status \"739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545\": rpc error: code = NotFound desc = could not find container \"739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545\": container with ID starting with 739cba4d63c62882eaa467eda1e1e3105a514f7d12f9bc99a6fa97459459f545 not found: ID does not exist" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.791221 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.876964 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.921438 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.952219 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.975332 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:33 crc kubenswrapper[5047]: E0223 07:10:33.975963 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-log" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.975988 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-log" Feb 23 07:10:33 crc kubenswrapper[5047]: E0223 07:10:33.976021 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-api" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.976029 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-api" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.976221 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-log" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.976255 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" containerName="nova-api-api" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.977479 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.981445 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.981453 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.981529 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 07:10:33 crc kubenswrapper[5047]: I0223 07:10:33.988600 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.042602 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp7fw"] Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.081328 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-public-tls-certs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.081439 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcj9\" (UniqueName: \"kubernetes.io/projected/db4c480a-88f1-42e1-bdce-21bdb85ecc48-kube-api-access-8lcj9\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.082592 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-config-data\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.082890 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.082980 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4c480a-88f1-42e1-bdce-21bdb85ecc48-logs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.083038 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-internal-tls-certs\") pod \"nova-api-0\" 
(UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.294893 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-public-tls-certs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.295039 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcj9\" (UniqueName: \"kubernetes.io/projected/db4c480a-88f1-42e1-bdce-21bdb85ecc48-kube-api-access-8lcj9\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.295093 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-config-data\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.295149 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.295178 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4c480a-88f1-42e1-bdce-21bdb85ecc48-logs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.295207 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-internal-tls-certs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.295929 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4c480a-88f1-42e1-bdce-21bdb85ecc48-logs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.299888 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-internal-tls-certs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.302638 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-config-data\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.302718 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.305553 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-public-tls-certs\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.318269 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8lcj9\" (UniqueName: \"kubernetes.io/projected/db4c480a-88f1-42e1-bdce-21bdb85ecc48-kube-api-access-8lcj9\") pod \"nova-api-0\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.358304 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4815efeb-328e-40fb-b420-c62b2fe7e984" path="/var/lib/kubelet/pods/4815efeb-328e-40fb-b420-c62b2fe7e984/volumes" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.598459 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:34 crc kubenswrapper[5047]: I0223 07:10:34.853520 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 07:10:35 crc kubenswrapper[5047]: I0223 07:10:35.096375 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:35 crc kubenswrapper[5047]: W0223 07:10:35.109938 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb4c480a_88f1_42e1_bdce_21bdb85ecc48.slice/crio-2a15af10b6fe7afc847351c4b45a65bee6b1b97516f6012fd8ae34523c830c7c WatchSource:0}: Error finding container 2a15af10b6fe7afc847351c4b45a65bee6b1b97516f6012fd8ae34523c830c7c: Status 404 returned error can't find the container with id 2a15af10b6fe7afc847351c4b45a65bee6b1b97516f6012fd8ae34523c830c7c Feb 23 07:10:35 crc kubenswrapper[5047]: I0223 07:10:35.542179 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fp7fw" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="registry-server" containerID="cri-o://cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138" gracePeriod=2 Feb 23 07:10:35 crc kubenswrapper[5047]: I0223 07:10:35.542687 5047 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4c480a-88f1-42e1-bdce-21bdb85ecc48","Type":"ContainerStarted","Data":"832e9b67b68d0d91fdc72fdc39407b689d44735575143025b03550e2cf607d48"} Feb 23 07:10:35 crc kubenswrapper[5047]: I0223 07:10:35.542719 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4c480a-88f1-42e1-bdce-21bdb85ecc48","Type":"ContainerStarted","Data":"2a15af10b6fe7afc847351c4b45a65bee6b1b97516f6012fd8ae34523c830c7c"} Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.092612 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.242871 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8r6h\" (UniqueName: \"kubernetes.io/projected/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-kube-api-access-t8r6h\") pod \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.243060 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-utilities\") pod \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.243171 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-catalog-content\") pod \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\" (UID: \"7cc22dfb-ad4d-4ae5-95f8-e55057a57878\") " Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.244568 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-utilities" (OuterVolumeSpecName: 
"utilities") pod "7cc22dfb-ad4d-4ae5-95f8-e55057a57878" (UID: "7cc22dfb-ad4d-4ae5-95f8-e55057a57878"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.251231 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-kube-api-access-t8r6h" (OuterVolumeSpecName: "kube-api-access-t8r6h") pod "7cc22dfb-ad4d-4ae5-95f8-e55057a57878" (UID: "7cc22dfb-ad4d-4ae5-95f8-e55057a57878"). InnerVolumeSpecName "kube-api-access-t8r6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.346734 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8r6h\" (UniqueName: \"kubernetes.io/projected/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-kube-api-access-t8r6h\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.346789 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.354149 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cc22dfb-ad4d-4ae5-95f8-e55057a57878" (UID: "7cc22dfb-ad4d-4ae5-95f8-e55057a57878"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.449490 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc22dfb-ad4d-4ae5-95f8-e55057a57878-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.561040 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4c480a-88f1-42e1-bdce-21bdb85ecc48","Type":"ContainerStarted","Data":"9e039263a53bbeb35ac12ba86211b210fd996b0f94ebba2771ea8c71cf94c08e"} Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.568154 5047 generic.go:334] "Generic (PLEG): container finished" podID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerID="cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138" exitCode=0 Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.568210 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp7fw" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.568258 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp7fw" event={"ID":"7cc22dfb-ad4d-4ae5-95f8-e55057a57878","Type":"ContainerDied","Data":"cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138"} Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.568887 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp7fw" event={"ID":"7cc22dfb-ad4d-4ae5-95f8-e55057a57878","Type":"ContainerDied","Data":"6cea7d333b14564f1ca636f29e95221a532028f3eaf3d3406b107e02d96f208c"} Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.568973 5047 scope.go:117] "RemoveContainer" containerID="cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.603646 5047 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.6036120289999998 podStartE2EDuration="3.603612029s" podCreationTimestamp="2026-02-23 07:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:36.591332026 +0000 UTC m=+1558.842659240" watchObservedRunningTime="2026-02-23 07:10:36.603612029 +0000 UTC m=+1558.854939193" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.618232 5047 scope.go:117] "RemoveContainer" containerID="ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.641124 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp7fw"] Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.649021 5047 scope.go:117] "RemoveContainer" containerID="3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.653795 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fp7fw"] Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.704297 5047 scope.go:117] "RemoveContainer" containerID="cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138" Feb 23 07:10:36 crc kubenswrapper[5047]: E0223 07:10:36.705090 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138\": container with ID starting with cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138 not found: ID does not exist" containerID="cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.705158 5047 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138"} err="failed to get container status \"cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138\": rpc error: code = NotFound desc = could not find container \"cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138\": container with ID starting with cd70f1b82a81d66e733cecc5c78a1ec8cdf8eab584ac96d8ce6822ca69bea138 not found: ID does not exist" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.705202 5047 scope.go:117] "RemoveContainer" containerID="ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93" Feb 23 07:10:36 crc kubenswrapper[5047]: E0223 07:10:36.705725 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93\": container with ID starting with ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93 not found: ID does not exist" containerID="ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.705789 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93"} err="failed to get container status \"ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93\": rpc error: code = NotFound desc = could not find container \"ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93\": container with ID starting with ab5ddd8e0748a837463e80388d303cfefe1235017cec17ffa96027aaf3e1fb93 not found: ID does not exist" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.705830 5047 scope.go:117] "RemoveContainer" containerID="3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c" Feb 23 07:10:36 crc kubenswrapper[5047]: E0223 07:10:36.706188 5047 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c\": container with ID starting with 3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c not found: ID does not exist" containerID="3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c" Feb 23 07:10:36 crc kubenswrapper[5047]: I0223 07:10:36.706244 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c"} err="failed to get container status \"3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c\": rpc error: code = NotFound desc = could not find container \"3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c\": container with ID starting with 3d61fb04ee0bbb4b053b534ea0c28bb11d21c70a67a08ecb4cddd00c5f55237c not found: ID does not exist" Feb 23 07:10:37 crc kubenswrapper[5047]: I0223 07:10:37.181707 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:10:37 crc kubenswrapper[5047]: I0223 07:10:37.181790 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:10:38 crc kubenswrapper[5047]: I0223 07:10:38.353768 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" path="/var/lib/kubelet/pods/7cc22dfb-ad4d-4ae5-95f8-e55057a57878/volumes" Feb 23 07:10:39 crc kubenswrapper[5047]: I0223 07:10:39.854306 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 07:10:39 crc kubenswrapper[5047]: I0223 07:10:39.896385 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:10:40 crc kubenswrapper[5047]: I0223 07:10:40.679641 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Feb 23 07:10:42 crc kubenswrapper[5047]: I0223 07:10:42.181883 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:10:42 crc kubenswrapper[5047]: I0223 07:10:42.182529 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:10:43 crc kubenswrapper[5047]: I0223 07:10:43.200125 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:43 crc kubenswrapper[5047]: I0223 07:10:43.200183 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:44 crc kubenswrapper[5047]: I0223 07:10:44.599111 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:10:44 crc kubenswrapper[5047]: I0223 07:10:44.599190 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:10:45 crc kubenswrapper[5047]: I0223 07:10:45.614170 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:45 crc kubenswrapper[5047]: I0223 07:10:45.614235 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:45 crc kubenswrapper[5047]: I0223 07:10:45.868619 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 07:10:52 crc kubenswrapper[5047]: I0223 07:10:52.185989 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 07:10:52 crc kubenswrapper[5047]: I0223 07:10:52.190789 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 07:10:52 crc kubenswrapper[5047]: I0223 07:10:52.192869 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:10:52 crc kubenswrapper[5047]: I0223 07:10:52.807807 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:10:54 crc kubenswrapper[5047]: I0223 07:10:54.612255 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:10:54 crc kubenswrapper[5047]: I0223 07:10:54.613183 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:10:54 crc kubenswrapper[5047]: I0223 07:10:54.618632 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:10:54 crc kubenswrapper[5047]: I0223 07:10:54.621430 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:10:54 crc kubenswrapper[5047]: I0223 07:10:54.823148 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:10:54 crc kubenswrapper[5047]: I0223 07:10:54.835439 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.042203 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bz2gr"] Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.043197 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="extract-content" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.043211 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="extract-content" Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.043230 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="registry-server" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.043235 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="registry-server" Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.043246 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="extract-utilities" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.043253 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="extract-utilities" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.043435 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc22dfb-ad4d-4ae5-95f8-e55057a57878" containerName="registry-server" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.044193 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.055128 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.063870 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b51d-account-create-update-6tf4f"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.065208 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.101032 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.141334 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b51d-account-create-update-6tf4f"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.164228 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bz2gr"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.206009 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b51d-account-create-update-9qmbt"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.215970 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-operator-scripts\") pod \"placement-b51d-account-create-update-6tf4f\" (UID: \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.216205 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn7df\" (UniqueName: 
\"kubernetes.io/projected/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-kube-api-access-nn7df\") pod \"placement-b51d-account-create-update-6tf4f\" (UID: \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.216687 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h2kg\" (UniqueName: \"kubernetes.io/projected/4cdbcdf8-785e-4383-ab02-4492360bf4b4-kube-api-access-6h2kg\") pod \"root-account-create-update-bz2gr\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.216979 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts\") pod \"root-account-create-update-bz2gr\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.316318 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b51d-account-create-update-9qmbt"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.332927 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts\") pod \"root-account-create-update-bz2gr\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.333155 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-operator-scripts\") pod \"placement-b51d-account-create-update-6tf4f\" (UID: 
\"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.333204 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn7df\" (UniqueName: \"kubernetes.io/projected/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-kube-api-access-nn7df\") pod \"placement-b51d-account-create-update-6tf4f\" (UID: \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.333356 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-965e-account-create-update-r6tbz"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.333457 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h2kg\" (UniqueName: \"kubernetes.io/projected/4cdbcdf8-785e-4383-ab02-4492360bf4b4-kube-api-access-6h2kg\") pod \"root-account-create-update-bz2gr\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.334978 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts\") pod \"root-account-create-update-bz2gr\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.335174 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.339963 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.343613 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-operator-scripts\") pod \"placement-b51d-account-create-update-6tf4f\" (UID: \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.368529 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-965e-account-create-update-r6tbz"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.410976 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8wl9w"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.425846 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8wl9w"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.438240 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h2kg\" (UniqueName: \"kubernetes.io/projected/4cdbcdf8-785e-4383-ab02-4492360bf4b4-kube-api-access-6h2kg\") pod \"root-account-create-update-bz2gr\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.439062 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn7df\" (UniqueName: \"kubernetes.io/projected/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-kube-api-access-nn7df\") pod \"placement-b51d-account-create-update-6tf4f\" (UID: \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " pod="openstack/placement-b51d-account-create-update-6tf4f" 
Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.440515 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltlc\" (UniqueName: \"kubernetes.io/projected/6665cd0b-0b9d-4485-8cdb-7abe364508d1-kube-api-access-vltlc\") pod \"barbican-965e-account-create-update-r6tbz\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") " pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.440789 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6665cd0b-0b9d-4485-8cdb-7abe364508d1-operator-scripts\") pod \"barbican-965e-account-create-update-r6tbz\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") " pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.456629 5047 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.456699 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:13.956682988 +0000 UTC m=+1596.208010112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-scripts" not found Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.457557 5047 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.457589 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:13.957580232 +0000 UTC m=+1596.208907366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-config-data" not found Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.518453 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-86db-account-create-update-dfrnh"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.520192 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.531887 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.544129 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.568878 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgg69\" (UniqueName: \"kubernetes.io/projected/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-kube-api-access-sgg69\") pod \"glance-86db-account-create-update-dfrnh\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") " pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.568954 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vltlc\" (UniqueName: \"kubernetes.io/projected/6665cd0b-0b9d-4485-8cdb-7abe364508d1-kube-api-access-vltlc\") pod \"barbican-965e-account-create-update-r6tbz\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") " pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.569068 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-operator-scripts\") pod \"glance-86db-account-create-update-dfrnh\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") " pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.569102 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6665cd0b-0b9d-4485-8cdb-7abe364508d1-operator-scripts\") pod 
\"barbican-965e-account-create-update-r6tbz\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") " pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.570074 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6665cd0b-0b9d-4485-8cdb-7abe364508d1-operator-scripts\") pod \"barbican-965e-account-create-update-r6tbz\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") " pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.583755 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.584006 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="ae539167-f011-4f50-8ce6-df90580fa157" containerName="openstackclient" containerID="cri-o://458110c59c1b32388e1bbf0559e62ae01cdd38bb1a5dc8a73f30c5ca7efaf50f" gracePeriod=2 Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.605548 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-86db-account-create-update-dfrnh"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.648055 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltlc\" (UniqueName: \"kubernetes.io/projected/6665cd0b-0b9d-4485-8cdb-7abe364508d1-kube-api-access-vltlc\") pod \"barbican-965e-account-create-update-r6tbz\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") " pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.653546 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.669496 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.671489 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-operator-scripts\") pod \"glance-86db-account-create-update-dfrnh\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") " pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.671736 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgg69\" (UniqueName: \"kubernetes.io/projected/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-kube-api-access-sgg69\") pod \"glance-86db-account-create-update-dfrnh\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") " pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.674353 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.674507 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data podName:1e83857b-5e17-4878-8f9b-e8d1a65325ba nodeName:}" failed. No retries permitted until 2026-02-23 07:11:14.174395365 +0000 UTC m=+1596.425722499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.674890 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-operator-scripts\") pod \"glance-86db-account-create-update-dfrnh\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") " pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.675668 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-965e-account-create-update-r6tbz" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.701424 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.718293 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgg69\" (UniqueName: \"kubernetes.io/projected/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-kube-api-access-sgg69\") pod \"glance-86db-account-create-update-dfrnh\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") " pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.725161 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-86db-account-create-update-5dpfx"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.760137 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-86db-account-create-update-5dpfx"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.783949 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 
07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.784304 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="ovn-northd" containerID="cri-o://5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" gracePeriod=30 Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.785271 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="openstack-network-exporter" containerID="cri-o://1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0" gracePeriod=30 Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.818335 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-965e-account-create-update-hv27b"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.848996 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c0b-account-create-update-vwfrq"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.866000 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-965e-account-create-update-hv27b"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.906544 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.926989 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8c0b-account-create-update-vwfrq"] Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.977816 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c0b-account-create-update-fvht5"] Feb 23 07:11:13 crc kubenswrapper[5047]: E0223 07:11:13.978398 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae539167-f011-4f50-8ce6-df90580fa157" containerName="openstackclient" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.978414 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae539167-f011-4f50-8ce6-df90580fa157" containerName="openstackclient" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.978727 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae539167-f011-4f50-8ce6-df90580fa157" containerName="openstackclient" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.981385 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:13 crc kubenswrapper[5047]: I0223 07:11:13.990462 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.002338 5047 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.002403 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:15.002388629 +0000 UTC m=+1597.253715763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-scripts" not found Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.002724 5047 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.002817 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:15.002793141 +0000 UTC m=+1597.254120275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-config-data" not found Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.033002 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e7d5-account-create-update-89ndz"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.034514 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.048179 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.074356 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c0b-account-create-update-fvht5"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.102994 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e7d5-account-create-update-89ndz"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.109090 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd8c46d-274a-4544-8d0e-16242f0d9673-operator-scripts\") pod \"cinder-8c0b-account-create-update-fvht5\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.109282 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrlq\" (UniqueName: \"kubernetes.io/projected/fbd8c46d-274a-4544-8d0e-16242f0d9673-kube-api-access-zxrlq\") pod \"cinder-8c0b-account-create-update-fvht5\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.154541 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e7d5-account-create-update-lwfj5"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.179987 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e7d5-account-create-update-lwfj5"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.194108 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ac64-account-create-update-zn4xz"] 
Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.197139 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.201614 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.209538 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-sr2bf"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.215262 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjzz\" (UniqueName: \"kubernetes.io/projected/b7504baf-d384-4fb8-a053-0da0c933f1e4-kube-api-access-5kjzz\") pod \"neutron-e7d5-account-create-update-89ndz\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") " pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.215346 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd8c46d-274a-4544-8d0e-16242f0d9673-operator-scripts\") pod \"cinder-8c0b-account-create-update-fvht5\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.215444 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7504baf-d384-4fb8-a053-0da0c933f1e4-operator-scripts\") pod \"neutron-e7d5-account-create-update-89ndz\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") " pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.215496 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zxrlq\" (UniqueName: \"kubernetes.io/projected/fbd8c46d-274a-4544-8d0e-16242f0d9673-kube-api-access-zxrlq\") pod \"cinder-8c0b-account-create-update-fvht5\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.218107 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.218205 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data podName:1e83857b-5e17-4878-8f9b-e8d1a65325ba nodeName:}" failed. No retries permitted until 2026-02-23 07:11:15.218177157 +0000 UTC m=+1597.469504291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.219738 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd8c46d-274a-4544-8d0e-16242f0d9673-operator-scripts\") pod \"cinder-8c0b-account-create-update-fvht5\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.241360 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.280533 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.286414 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac64-account-create-update-zn4xz"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.289459 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrlq\" (UniqueName: \"kubernetes.io/projected/fbd8c46d-274a-4544-8d0e-16242f0d9673-kube-api-access-zxrlq\") pod \"cinder-8c0b-account-create-update-fvht5\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.321149 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7504baf-d384-4fb8-a053-0da0c933f1e4-operator-scripts\") pod \"neutron-e7d5-account-create-update-89ndz\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") " pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.321208 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82712c2b-a91c-4e0e-9400-0624b0459f57-operator-scripts\") pod \"nova-api-ac64-account-create-update-zn4xz\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.321294 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6gjd\" (UniqueName: \"kubernetes.io/projected/82712c2b-a91c-4e0e-9400-0624b0459f57-kube-api-access-p6gjd\") pod 
\"nova-api-ac64-account-create-update-zn4xz\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.321323 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969db92a-069e-4713-bca7-5e4ebb89d612-operator-scripts\") pod \"nova-cell0-5b36-account-create-update-sr2bf\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.321344 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjzz\" (UniqueName: \"kubernetes.io/projected/b7504baf-d384-4fb8-a053-0da0c933f1e4-kube-api-access-5kjzz\") pod \"neutron-e7d5-account-create-update-89ndz\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") " pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.321380 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwsp5\" (UniqueName: \"kubernetes.io/projected/969db92a-069e-4713-bca7-5e4ebb89d612-kube-api-access-pwsp5\") pod \"nova-cell0-5b36-account-create-update-sr2bf\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.323816 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7504baf-d384-4fb8-a053-0da0c933f1e4-operator-scripts\") pod \"neutron-e7d5-account-create-update-89ndz\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") " pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.378051 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.461692 5047 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:59232->38.102.83.129:42399: write tcp 38.102.83.129:59232->38.102.83.129:42399: write: broken pipe Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.479123 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6gjd\" (UniqueName: \"kubernetes.io/projected/82712c2b-a91c-4e0e-9400-0624b0459f57-kube-api-access-p6gjd\") pod \"nova-api-ac64-account-create-update-zn4xz\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.479252 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969db92a-069e-4713-bca7-5e4ebb89d612-operator-scripts\") pod \"nova-cell0-5b36-account-create-update-sr2bf\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.479399 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsp5\" (UniqueName: \"kubernetes.io/projected/969db92a-069e-4713-bca7-5e4ebb89d612-kube-api-access-pwsp5\") pod \"nova-cell0-5b36-account-create-update-sr2bf\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.479735 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82712c2b-a91c-4e0e-9400-0624b0459f57-operator-scripts\") pod \"nova-api-ac64-account-create-update-zn4xz\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " 
pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.563419 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969db92a-069e-4713-bca7-5e4ebb89d612-operator-scripts\") pod \"nova-cell0-5b36-account-create-update-sr2bf\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.576527 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjzz\" (UniqueName: \"kubernetes.io/projected/b7504baf-d384-4fb8-a053-0da0c933f1e4-kube-api-access-5kjzz\") pod \"neutron-e7d5-account-create-update-89ndz\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") " pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.576757 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82712c2b-a91c-4e0e-9400-0624b0459f57-operator-scripts\") pod \"nova-api-ac64-account-create-update-zn4xz\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.682050 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsp5\" (UniqueName: \"kubernetes.io/projected/969db92a-069e-4713-bca7-5e4ebb89d612-kube-api-access-pwsp5\") pod \"nova-cell0-5b36-account-create-update-sr2bf\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.714360 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-89ndz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.782058 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6gjd\" (UniqueName: \"kubernetes.io/projected/82712c2b-a91c-4e0e-9400-0624b0459f57-kube-api-access-p6gjd\") pod \"nova-api-ac64-account-create-update-zn4xz\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.827566 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146d74a4-9a27-47d9-87e7-3372501acab0" path="/var/lib/kubelet/pods/146d74a4-9a27-47d9-87e7-3372501acab0/volumes" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.828542 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efebf35-1d9c-450d-abb5-6e247edc9db2" path="/var/lib/kubelet/pods/1efebf35-1d9c-450d-abb5-6e247edc9db2/volumes" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.829540 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3757bd5c-675e-4944-9a92-9e684e25ef6d" path="/var/lib/kubelet/pods/3757bd5c-675e-4944-9a92-9e684e25ef6d/volumes" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.830094 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63179dda-c120-44c4-9f07-1cef37ca07e0" path="/var/lib/kubelet/pods/63179dda-c120-44c4-9f07-1cef37ca07e0/volumes" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.838783 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9" path="/var/lib/kubelet/pods/a90c97f0-3c5e-4b65-aee9-bd9c2dd5fbb9/volumes" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.839395 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8693532-d88f-49d6-a72c-4d751df3a3eb" 
path="/var/lib/kubelet/pods/b8693532-d88f-49d6-a72c-4d751df3a3eb/volumes" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.840099 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.840140 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-sr2bf"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.840153 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.840165 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-tqpzs"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.843624 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerName="openstack-network-exporter" containerID="cri-o://aab582d93c645f1cd1f93f646d13990d4cebf1f69a19788d254697fedd7fc7f7" gracePeriod=300 Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.868468 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.869979 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-tqpzs"] Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.908287 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-hxmz5"] Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.922062 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:11:14 crc kubenswrapper[5047]: E0223 07:11:14.922142 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data podName:6776acf2-e53f-4892-847d-8667669a5eb9 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:15.4221272 +0000 UTC m=+1597.673454334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data") pod "rabbitmq-server-0" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9") : configmap "rabbitmq-config-data" not found Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.935707 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.936062 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:14 crc kubenswrapper[5047]: I0223 07:11:14.938092 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.025335 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnc4s\" (UniqueName: \"kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.025815 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.028009 5047 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.028074 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:17.028058552 +0000 UTC m=+1599.279385686 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-scripts" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.028519 5047 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.028548 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:17.028538784 +0000 UTC m=+1599.279865918 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-config-data" not found Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.041881 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ac64-account-create-update-thwgw"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.071983 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-pthd9"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.101354 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-pthd9"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.131509 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc4s\" (UniqueName: \"kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " 
pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.131628 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.131955 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.132027 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:15.632003771 +0000 UTC m=+1597.883330905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : configmap "openstack-cell1-scripts" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.163372 5047 projected.go:194] Error preparing data for projected volume kube-api-access-dnc4s for pod openstack/nova-cell1-08b6-account-create-update-hxmz5: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.163467 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:11:15.663440449 +0000 UTC m=+1597.914767583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dnc4s" (UniqueName: "kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.176872 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-hxmz5"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.210526 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-kbprb"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.223562 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerName="ovsdbserver-nb" containerID="cri-o://dbc428f4bdef3134e02420519caff4df17024718afa37dadd912cd858a13c0cc" gracePeriod=300 Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.234424 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.234512 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data podName:1e83857b-5e17-4878-8f9b-e8d1a65325ba nodeName:}" failed. No retries permitted until 2026-02-23 07:11:17.234490722 +0000 UTC m=+1599.485817856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.285209 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ac64-account-create-update-thwgw"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.311605 5047 generic.go:334] "Generic (PLEG): container finished" podID="b9ebb281-4310-483e-b599-3d3c8775e341" containerID="1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0" exitCode=2 Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.312166 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9ebb281-4310-483e-b599-3d3c8775e341","Type":"ContainerDied","Data":"1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0"} Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.365632 5047 generic.go:334] "Generic (PLEG): container finished" podID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerID="aab582d93c645f1cd1f93f646d13990d4cebf1f69a19788d254697fedd7fc7f7" exitCode=2 Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.365727 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"01ca037a-0388-4fdc-9106-1abbfc17566d","Type":"ContainerDied","Data":"aab582d93c645f1cd1f93f646d13990d4cebf1f69a19788d254697fedd7fc7f7"} Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.385241 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-kbprb"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.405927 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9bx9t"] Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.443372 5047 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:11:15 crc kubenswrapper[5047]: E0223 07:11:15.443451 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data podName:6776acf2-e53f-4892-847d-8667669a5eb9 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:16.443425808 +0000 UTC m=+1598.694752942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data") pod "rabbitmq-server-0" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9") : configmap "rabbitmq-config-data" not found Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.456503 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9bx9t"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.508124 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2dqll"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.510708 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.548258 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-5vjkd"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.560478 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nglmq"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.571863 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-5vjkd"] Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.582146 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-catalog-content\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.582330 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hn8\" (UniqueName: \"kubernetes.io/projected/8fc70af3-2572-4019-adcb-8aecf538ae27-kube-api-access-c4hn8\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.582381 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-utilities\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.586593 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nglmq"] Feb 23 
07:11:15 crc kubenswrapper[5047]: I0223 07:11:15.598719 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dqll"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.619847 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-hzs78"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.620086 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-hzs78" podUID="e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" containerName="openstack-network-exporter" containerID="cri-o://13fb30f0e98afd5102c75b063d5e9dedad3a88eb8a1cb4f9dfb35321bd545830" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.629565 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8zsv6"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.640041 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8zsv6"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.670314 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fk6gc"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.680954 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-f7lbh"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.687491 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc4s\" (UniqueName: \"kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.687811 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-catalog-content\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.687862 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.687919 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hn8\" (UniqueName: \"kubernetes.io/projected/8fc70af3-2572-4019-adcb-8aecf538ae27-kube-api-access-c4hn8\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.687946 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-utilities\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:15.688088 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:15.688179 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:11:16.688147938 +0000 UTC m=+1598.939475072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : configmap "openstack-cell1-scripts" not found Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.688473 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-utilities\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.688734 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-catalog-content\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:15.692894 5047 projected.go:194] Error preparing data for projected volume kube-api-access-dnc4s for pod openstack/nova-cell1-08b6-account-create-update-hxmz5: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:15.692971 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:16.692952965 +0000 UTC m=+1598.944280099 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dnc4s" (UniqueName: "kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.698412 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-vw9rj"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.707337 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-vw9rj"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.719897 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f4db5cb66-tjpmr"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.729610 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f4db5cb66-tjpmr" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-log" containerID="cri-o://a0e78cccdc69fcb4e0931f1814bebbb8de46564c39b911b4f4864d69cd56c138" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.730499 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f4db5cb66-tjpmr" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-api" containerID="cri-o://1bff42b06a8741633963197012e9bcd6401b4df5472444c488f4134bb829bd6d" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.778335 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.781244 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="openstack-network-exporter" 
containerID="cri-o://98cb4fcb07ef6bff93f43c15de37a5ab70e29f4de36a5da8359c405261211da7" gracePeriod=300 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.814590 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hn8\" (UniqueName: \"kubernetes.io/projected/8fc70af3-2572-4019-adcb-8aecf538ae27-kube-api-access-c4hn8\") pod \"redhat-marketplace-2dqll\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.825295 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826122 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-server" containerID="cri-o://07872e17cdb6ef25b82b90ca7744f33fec2b7c83d31131b37d885019a804232f" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826604 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-updater" containerID="cri-o://cf249292c8f2ae891705ce8b850d38cf0ae227f4304c261b1ef67518dde444f6" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826691 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="swift-recon-cron" containerID="cri-o://5438c98e6edb083f8efe8f16e3d1152881677a6739c97c34bfdb8575aac23a30" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826708 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-auditor" 
containerID="cri-o://3322a1a63d982b007337c016df9dadd09417398badcaa9dcc4d0a6bf4fe12d41" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826744 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="rsync" containerID="cri-o://1497b14f7cabd5e74a05e2a73df9cf90fa7bf74823b616c5182cd8dc8c2842e9" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826767 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-replicator" containerID="cri-o://7f2b74932ff71cab49843de5a99395f2f3995ed6305e9aef4d30ec156d1541a1" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826789 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-expirer" containerID="cri-o://610194c27d45dd83cb9a83ae67eb964cc41e536d39f862f18b7fbc1666c8f18e" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826811 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-server" containerID="cri-o://bd9a6c11cbd3bdfd513be881b8c42a51f7ab2e9b360a02ddb2a5e8383e963bd4" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826838 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-updater" containerID="cri-o://fb9fbedf021ff6c481977b592703012f584d141b404e356a7e74031c5b137c6a" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826853 5047 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-reaper" containerID="cri-o://17ebe3519431a9953fc33fed5f2c323c9c8f0bbaad32401ddf2b831449414025" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826880 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-auditor" containerID="cri-o://5f0a6444a7b19329569218b09c622e3e4c94a546c76d994e6e7f55372800737f" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826898 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-auditor" containerID="cri-o://a19fd42fd14901f6743c36bb99a68aa21bd85e41d93af3960094cb69a6484383" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826945 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-replicator" containerID="cri-o://e8eadc030f3e0f8a80e20468bb35964a892723d0a32063f32fdb445ccfa9a64c" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826974 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-replicator" containerID="cri-o://ef7d56ab512ba7ddcef234fef06dfaa7575dadd48c522e8ea9b685da5f62a286" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.826990 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-server" containerID="cri-o://2271c0a7b8b654682e42f554500d0387d96647dadb59e76a372f57a74ce531ad" gracePeriod=30 Feb 23 07:11:16 crc 
kubenswrapper[5047]: I0223 07:11:15.889260 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.935892 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnqnk"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:15.957455 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="ovsdbserver-sb" containerID="cri-o://8656eb8e7f120230bec3b1f18fdcd072d9f82031f44305eb6078bf7e798a30c4" gracePeriod=300 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.050061 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzmn4"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.091338 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnqnk"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.127234 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzmn4"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.152431 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.153372 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-httpd" containerID="cri-o://92383f49d49f2ba213699c653ceecca29ba9aba4cc99635c1909b4aa5789dcd9" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.153319 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-log" 
containerID="cri-o://d3073ef9efc88121d60f580513fe1f931a6371611ccd81a1a8970553da615a84" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.181940 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-7s2lc"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.182246 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7677694455-7s2lc" podUID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerName="dnsmasq-dns" containerID="cri-o://7fb546caaa8981682e3f4a1e627c0bc5a97ab373a74aa0917242279733ff16a3" gracePeriod=10 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.213052 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b51d-account-create-update-6tf4f"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.242260 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.242319 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.242571 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="cinder-scheduler" containerID="cri-o://b1156e1b07cb20a8e9857fce83fae53cb9cfb48c367023bcb30470fba1c4f122" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.243090 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-log" containerID="cri-o://9fde3b6829cb3940810ad8ed9c3d9375377b1ab751496ae29f4114c773487dad" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.243562 5047 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="probe" containerID="cri-o://999e38cfe887aedaec12de40ca31ea006bf118394964b12586bbd7546473dc73" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.243934 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-httpd" containerID="cri-o://182404da8e3655734af75c4e164a4fa1026d1387cba22058cc3f4e7a9156ac6b" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.254545 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dbb59"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.264823 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-785b8b86cf-6rvf8"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.265302 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-785b8b86cf-6rvf8" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-api" containerID="cri-o://09006301b0f6020685672d00703b449ef9d808aef32ba06e059c9e2356603a84" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.265999 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-785b8b86cf-6rvf8" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-httpd" containerID="cri-o://563e1e2512e494abff485a89912e960e6a601055dece369633c27c745f0ea956" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.309203 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dbb59"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.378164 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156f1174-6481-45c2-8a34-bc744828e345" 
path="/var/lib/kubelet/pods/156f1174-6481-45c2-8a34-bc744828e345/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.386124 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" containerID="cri-o://0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.390366 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d0a292-4600-45ed-a947-37e93bccaea8" path="/var/lib/kubelet/pods/15d0a292-4600-45ed-a947-37e93bccaea8/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.392264 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9b1413-5859-4568-b3f8-d9401fe4b34e" path="/var/lib/kubelet/pods/1b9b1413-5859-4568-b3f8-d9401fe4b34e/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.396766 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e1cee2a-62ea-4010-8d71-d9ddd321a9c2" path="/var/lib/kubelet/pods/2e1cee2a-62ea-4010-8d71-d9ddd321a9c2/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.400052 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4effbb70-cdb3-42e4-a8d3-e1df904f38b9" path="/var/lib/kubelet/pods/4effbb70-cdb3-42e4-a8d3-e1df904f38b9/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.401652 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67fadcfb-082c-4b01-9972-816963267dff" path="/var/lib/kubelet/pods/67fadcfb-082c-4b01-9972-816963267dff/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.402708 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ad17d77-4987-46fc-aa46-622e08b708ea" path="/var/lib/kubelet/pods/9ad17d77-4987-46fc-aa46-622e08b708ea/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.404376 5047 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca947e1-09b7-404f-9162-94484bf52701" path="/var/lib/kubelet/pods/aca947e1-09b7-404f-9162-94484bf52701/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.405156 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54bac12-7429-4d94-a023-535d11e803d4" path="/var/lib/kubelet/pods/b54bac12-7429-4d94-a023-535d11e803d4/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.405676 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2629401-ad79-4aab-b3ec-665c69eeaf78" path="/var/lib/kubelet/pods/c2629401-ad79-4aab-b3ec-665c69eeaf78/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.406837 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a96bf2-f9b9-4f1e-a884-202eb437a8fc" path="/var/lib/kubelet/pods/c7a96bf2-f9b9-4f1e-a884-202eb437a8fc/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.408820 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2fe919f-6631-4332-af78-35e21d657657" path="/var/lib/kubelet/pods/f2fe919f-6631-4332-af78-35e21d657657/volumes" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.409466 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.409495 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.409506 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-86db-account-create-update-dfrnh"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.409516 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hfcz7"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.409524 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-s9nk6"] Feb 23 07:11:16 
crc kubenswrapper[5047]: I0223 07:11:16.409724 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api-log" containerID="cri-o://8fc7018d470fbeb3c101b59cebfc490bd0f1cc9cb98ca2bbc51f91d35a503470" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.410605 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api" containerID="cri-o://28fb45634504bebe907f3061a04e53baf4e24bef8b044220704d400b3404f735" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.416264 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-s9nk6"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.425298 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hfcz7"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.426570 5047 generic.go:334] "Generic (PLEG): container finished" podID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerID="7fb546caaa8981682e3f4a1e627c0bc5a97ab373a74aa0917242279733ff16a3" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.426616 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-7s2lc" event={"ID":"68efda44-7cf0-44c4-bc50-df0b73ed5b8b","Type":"ContainerDied","Data":"7fb546caaa8981682e3f4a1e627c0bc5a97ab373a74aa0917242279733ff16a3"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.432254 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c0b-account-create-update-fvht5"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.436441 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hzs78_e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d/openstack-network-exporter/0.log" Feb 23 07:11:16 crc 
kubenswrapper[5047]: I0223 07:11:16.436509 5047 generic.go:334] "Generic (PLEG): container finished" podID="e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" containerID="13fb30f0e98afd5102c75b063d5e9dedad3a88eb8a1cb4f9dfb35321bd545830" exitCode=2 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.436572 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hzs78" event={"ID":"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d","Type":"ContainerDied","Data":"13fb30f0e98afd5102c75b063d5e9dedad3a88eb8a1cb4f9dfb35321bd545830"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467292 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="610194c27d45dd83cb9a83ae67eb964cc41e536d39f862f18b7fbc1666c8f18e" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467326 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="fb9fbedf021ff6c481977b592703012f584d141b404e356a7e74031c5b137c6a" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467335 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="5f0a6444a7b19329569218b09c622e3e4c94a546c76d994e6e7f55372800737f" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467342 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="e8eadc030f3e0f8a80e20468bb35964a892723d0a32063f32fdb445ccfa9a64c" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467349 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="2271c0a7b8b654682e42f554500d0387d96647dadb59e76a372f57a74ce531ad" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467358 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" 
containerID="cf249292c8f2ae891705ce8b850d38cf0ae227f4304c261b1ef67518dde444f6" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467367 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="3322a1a63d982b007337c016df9dadd09417398badcaa9dcc4d0a6bf4fe12d41" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467373 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="7f2b74932ff71cab49843de5a99395f2f3995ed6305e9aef4d30ec156d1541a1" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467381 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="17ebe3519431a9953fc33fed5f2c323c9c8f0bbaad32401ddf2b831449414025" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467389 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="a19fd42fd14901f6743c36bb99a68aa21bd85e41d93af3960094cb69a6484383" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467395 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="ef7d56ab512ba7ddcef234fef06dfaa7575dadd48c522e8ea9b685da5f62a286" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467401 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="07872e17cdb6ef25b82b90ca7744f33fec2b7c83d31131b37d885019a804232f" exitCode=0 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467449 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"610194c27d45dd83cb9a83ae67eb964cc41e536d39f862f18b7fbc1666c8f18e"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467481 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"fb9fbedf021ff6c481977b592703012f584d141b404e356a7e74031c5b137c6a"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467491 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"5f0a6444a7b19329569218b09c622e3e4c94a546c76d994e6e7f55372800737f"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467499 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"e8eadc030f3e0f8a80e20468bb35964a892723d0a32063f32fdb445ccfa9a64c"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467506 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"2271c0a7b8b654682e42f554500d0387d96647dadb59e76a372f57a74ce531ad"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467514 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"cf249292c8f2ae891705ce8b850d38cf0ae227f4304c261b1ef67518dde444f6"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467523 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"3322a1a63d982b007337c016df9dadd09417398badcaa9dcc4d0a6bf4fe12d41"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467531 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"7f2b74932ff71cab49843de5a99395f2f3995ed6305e9aef4d30ec156d1541a1"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467538 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"17ebe3519431a9953fc33fed5f2c323c9c8f0bbaad32401ddf2b831449414025"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467547 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"a19fd42fd14901f6743c36bb99a68aa21bd85e41d93af3960094cb69a6484383"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467555 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"ef7d56ab512ba7ddcef234fef06dfaa7575dadd48c522e8ea9b685da5f62a286"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.467564 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"07872e17cdb6ef25b82b90ca7744f33fec2b7c83d31131b37d885019a804232f"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.474897 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-j4sn8"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.478563 5047 generic.go:334] "Generic (PLEG): container finished" podID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerID="a0e78cccdc69fcb4e0931f1814bebbb8de46564c39b911b4f4864d69cd56c138" exitCode=143 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.478655 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f4db5cb66-tjpmr" 
event={"ID":"a452777f-60a7-4cfc-9e9b-b262be0a27cf","Type":"ContainerDied","Data":"a0e78cccdc69fcb4e0931f1814bebbb8de46564c39b911b4f4864d69cd56c138"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.492363 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_01ca037a-0388-4fdc-9106-1abbfc17566d/ovsdbserver-nb/0.log" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.492410 5047 generic.go:334] "Generic (PLEG): container finished" podID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerID="dbc428f4bdef3134e02420519caff4df17024718afa37dadd912cd858a13c0cc" exitCode=143 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.492479 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"01ca037a-0388-4fdc-9106-1abbfc17566d","Type":"ContainerDied","Data":"dbc428f4bdef3134e02420519caff4df17024718afa37dadd912cd858a13c0cc"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.497352 5047 generic.go:334] "Generic (PLEG): container finished" podID="ae539167-f011-4f50-8ce6-df90580fa157" containerID="458110c59c1b32388e1bbf0559e62ae01cdd38bb1a5dc8a73f30c5ca7efaf50f" exitCode=137 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.497641 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-j4sn8"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.498343 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerName="rabbitmq" containerID="cri-o://2ac5bc2b66d5e8526a58472b2784d5b39320ba72eae19f8bc017391874b9616a" gracePeriod=604800 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.510796 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_850d5f83-0ab4-4d06-8fae-21e0cf9b37c8/ovsdbserver-sb/0.log" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.510866 5047 generic.go:334] "Generic 
(PLEG): container finished" podID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerID="98cb4fcb07ef6bff93f43c15de37a5ab70e29f4de36a5da8359c405261211da7" exitCode=2 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.510900 5047 generic.go:334] "Generic (PLEG): container finished" podID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerID="8656eb8e7f120230bec3b1f18fdcd072d9f82031f44305eb6078bf7e798a30c4" exitCode=143 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.511053 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8","Type":"ContainerDied","Data":"98cb4fcb07ef6bff93f43c15de37a5ab70e29f4de36a5da8359c405261211da7"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.511097 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8","Type":"ContainerDied","Data":"8656eb8e7f120230bec3b1f18fdcd072d9f82031f44305eb6078bf7e798a30c4"} Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.518889 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.519152 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5c628150-3bde-4740-9c94-dc208f61ade2" containerName="nova-scheduler-scheduler" containerID="cri-o://673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.525134 5047 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 23 07:11:16 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 23 07:11:16 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions Feb 23 07:11:16 crc kubenswrapper[5047]: ++ 
OVNBridge=br-int Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNRemote=tcp:localhost:6642 Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNEncapType=geneve Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNAvailabilityZones= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ EnableChassisAsGateway=true Feb 23 07:11:16 crc kubenswrapper[5047]: ++ PhysicalNetworks= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNHostName= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 23 07:11:16 crc kubenswrapper[5047]: ++ ovs_dir=/var/lib/openvswitch Feb 23 07:11:16 crc kubenswrapper[5047]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 23 07:11:16 crc kubenswrapper[5047]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 23 07:11:16 crc kubenswrapper[5047]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 23 07:11:16 crc kubenswrapper[5047]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 23 07:11:16 crc kubenswrapper[5047]: + sleep 0.5 Feb 23 07:11:16 crc kubenswrapper[5047]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 23 07:11:16 crc kubenswrapper[5047]: + cleanup_ovsdb_server_semaphore Feb 23 07:11:16 crc kubenswrapper[5047]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 23 07:11:16 crc kubenswrapper[5047]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 23 07:11:16 crc kubenswrapper[5047]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-f7lbh" message=< Feb 23 07:11:16 crc kubenswrapper[5047]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 23 07:11:16 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNBridge=br-int Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNRemote=tcp:localhost:6642 Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNEncapType=geneve Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNAvailabilityZones= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ EnableChassisAsGateway=true Feb 23 07:11:16 crc kubenswrapper[5047]: ++ PhysicalNetworks= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNHostName= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 23 07:11:16 crc kubenswrapper[5047]: ++ ovs_dir=/var/lib/openvswitch Feb 23 07:11:16 crc kubenswrapper[5047]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 23 07:11:16 crc kubenswrapper[5047]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 23 07:11:16 crc kubenswrapper[5047]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 23 07:11:16 crc kubenswrapper[5047]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 23 07:11:16 crc kubenswrapper[5047]: + sleep 0.5 Feb 23 07:11:16 crc kubenswrapper[5047]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 23 07:11:16 crc kubenswrapper[5047]: + cleanup_ovsdb_server_semaphore Feb 23 07:11:16 crc kubenswrapper[5047]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 23 07:11:16 crc kubenswrapper[5047]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 23 07:11:16 crc kubenswrapper[5047]: > Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.525172 5047 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 23 07:11:16 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 23 07:11:16 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNBridge=br-int Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNRemote=tcp:localhost:6642 Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNEncapType=geneve Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNAvailabilityZones= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ EnableChassisAsGateway=true Feb 23 07:11:16 crc kubenswrapper[5047]: ++ PhysicalNetworks= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ OVNHostName= Feb 23 07:11:16 crc kubenswrapper[5047]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 23 07:11:16 crc kubenswrapper[5047]: ++ ovs_dir=/var/lib/openvswitch Feb 23 07:11:16 crc kubenswrapper[5047]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 23 07:11:16 crc kubenswrapper[5047]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 23 07:11:16 crc kubenswrapper[5047]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 23 07:11:16 crc kubenswrapper[5047]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 23 07:11:16 crc kubenswrapper[5047]: + sleep 0.5 Feb 23 07:11:16 crc kubenswrapper[5047]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 23 07:11:16 crc kubenswrapper[5047]: + cleanup_ovsdb_server_semaphore Feb 23 07:11:16 crc kubenswrapper[5047]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 23 07:11:16 crc kubenswrapper[5047]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 23 07:11:16 crc kubenswrapper[5047]: > pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" containerID="cri-o://62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.525203 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" containerID="cri-o://62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.525411 5047 generic.go:334] "Generic (PLEG): container finished" podID="69d03df4-c334-4c64-a273-e4e307df5add" containerID="d3073ef9efc88121d60f580513fe1f931a6371611ccd81a1a8970553da615a84" exitCode=143 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.525443 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69d03df4-c334-4c64-a273-e4e307df5add","Type":"ContainerDied","Data":"d3073ef9efc88121d60f580513fe1f931a6371611ccd81a1a8970553da615a84"} Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.538658 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.538734 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data podName:6776acf2-e53f-4892-847d-8667669a5eb9 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:11:18.538719115 +0000 UTC m=+1600.790046249 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data") pod "rabbitmq-server-0" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9") : configmap "rabbitmq-config-data" not found Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.556226 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e7d5-account-create-update-89ndz"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.576288 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.576642 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.576978 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-log" containerID="cri-o://832e9b67b68d0d91fdc72fdc39407b689d44735575143025b03550e2cf607d48" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.577489 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-api" containerID="cri-o://9e039263a53bbeb35ac12ba86211b210fd996b0f94ebba2771ea8c71cf94c08e" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.600059 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ll9m9"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.613210 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-ll9m9"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.633382 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-799dddc985-b669w"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.633743 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-799dddc985-b669w" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker-log" containerID="cri-o://4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.634383 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-799dddc985-b669w" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker" containerID="cri-o://9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.661799 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-965e-account-create-update-r6tbz"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.680280 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-67c5f5b45b-szhrl"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.680924 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener-log" containerID="cri-o://aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.681444 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener" 
containerID="cri-o://d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.692237 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.693045 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-metadata" containerID="cri-o://57928662f0ede1894731138f7257539c72b6683c8af427121b07e24b7dd35d1f" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.693113 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:11:16 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:11:16 crc kubenswrapper[5047]: Feb 23 07:11:16 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:11:16 crc kubenswrapper[5047]: Feb 23 07:11:16 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:11:16 crc kubenswrapper[5047]: Feb 23 07:11:16 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:11:16 crc kubenswrapper[5047]: Feb 23 07:11:16 crc kubenswrapper[5047]: if [ -n "barbican" ]; then Feb 23 07:11:16 crc kubenswrapper[5047]: GRANT_DATABASE="barbican" Feb 23 07:11:16 crc kubenswrapper[5047]: else Feb 23 07:11:16 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 07:11:16 crc kubenswrapper[5047]: fi Feb 23 07:11:16 crc kubenswrapper[5047]: Feb 23 07:11:16 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 07:11:16 crc kubenswrapper[5047]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:11:16 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:11:16 crc kubenswrapper[5047]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:11:16 crc kubenswrapper[5047]: # support updates Feb 23 07:11:16 crc kubenswrapper[5047]: Feb 23 07:11:16 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.693182 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-log" containerID="cri-o://0108825e07a1a995d5a32fde6ef858690c1ed2dd364cbab05781f1acab8f3fcb" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.696449 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-965e-account-create-update-r6tbz" podUID="6665cd0b-0b9d-4485-8cdb-7abe364508d1" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.724770 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.742952 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc4s\" (UniqueName: \"kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.743031 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.744091 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.744150 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:18.744134248 +0000 UTC m=+1600.995461382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : configmap "openstack-cell1-scripts" not found Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.745544 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.745881 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" containerName="nova-cell0-conductor-conductor" containerID="cri-o://64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" gracePeriod=30 Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.748706 5047 projected.go:194] Error preparing data for projected volume kube-api-access-dnc4s for pod openstack/nova-cell1-08b6-account-create-update-hxmz5: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:16 crc kubenswrapper[5047]: E0223 07:11:16.748753 
5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:18.74874196 +0000 UTC m=+1601.000069094 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnc4s" (UniqueName: "kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.798277 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wlldh"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.839325 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-w65gd"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.863390 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wlldh"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.922573 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-w65gd"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.952614 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ac64-account-create-update-zn4xz"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.968245 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwwzn"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.988054 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:11:16 crc kubenswrapper[5047]: I0223 07:11:16.988414 5047 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-conductor-0" podUID="911535c0-45eb-4361-b169-fad54a54d78b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6" gracePeriod=30 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.011370 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwwzn"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.021000 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.021452 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d513cfd3-cb98-440f-b564-d36d8f20f5a4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://106b2bdb20d1a2d2e867ad06c463fa1d14f76f5b815d49afbd6405b8e6d0af57" gracePeriod=30 Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.058108 5047 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.058156 5047 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.058292 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:21.058257627 +0000 UTC m=+1603.309584921 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-scripts" not found Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.060145 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data podName:a452777f-60a7-4cfc-9e9b-b262be0a27cf nodeName:}" failed. No retries permitted until 2026-02-23 07:11:21.060113106 +0000 UTC m=+1603.311440370 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data") pod "placement-5f4db5cb66-tjpmr" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf") : secret "placement-config-data" not found Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.091286 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-r5j6n"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.100931 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-r5j6n"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.124513 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-sr2bf"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.131287 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-hxmz5"] Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.132390 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-dnc4s operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" podUID="9819ef7e-f619-45a5-b5d3-e67bb5214f24" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.137791 
5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-z9zcf"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.145876 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-z9zcf"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.152148 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5cb54ddf68-t6k79"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.152482 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5cb54ddf68-t6k79" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api-log" containerID="cri-o://41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854" gracePeriod=30 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.152976 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5cb54ddf68-t6k79" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api" containerID="cri-o://71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd" gracePeriod=30 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.162324 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.203402 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-965e-account-create-update-r6tbz"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.234267 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bz2gr"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.251724 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_850d5f83-0ab4-4d06-8fae-21e0cf9b37c8/ovsdbserver-sb/0.log" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.251821 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.274801 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.274917 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data podName:1e83857b-5e17-4878-8f9b-e8d1a65325ba nodeName:}" failed. No retries permitted until 2026-02-23 07:11:21.274877306 +0000 UTC m=+1603.526204440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.276496 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_01ca037a-0388-4fdc-9106-1abbfc17566d/ovsdbserver-nb/0.log" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.276600 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.284667 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b51d-account-create-update-6tf4f"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.316475 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6776acf2-e53f-4892-847d-8667669a5eb9" containerName="rabbitmq" containerID="cri-o://4ace61959f8493ad41766db44b20ac96b4147030ebc66648117da372593fea6c" gracePeriod=604800 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.320952 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-86db-account-create-update-dfrnh"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.331341 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hzs78_e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d/openstack-network-exporter/0.log" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.331467 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.336026 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.350805 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384218 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovs-rundir\") pod \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384292 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdb-rundir\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384321 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384342 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-nb\") pod \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384364 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-combined-ca-bundle\") pod \"ae539167-f011-4f50-8ce6-df90580fa157\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384386 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovn-rundir\") pod \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384450 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-config\") pod \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384471 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-scripts\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384488 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-sb\") pod \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384510 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-metrics-certs-tls-certs\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384538 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-svc\") pod \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384568 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-combined-ca-bundle\") pod \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384603 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnz8l\" (UniqueName: \"kubernetes.io/projected/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-kube-api-access-bnz8l\") pod \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384627 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config-secret\") pod \"ae539167-f011-4f50-8ce6-df90580fa157\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384648 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdb-rundir\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384699 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdbserver-sb-tls-certs\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384714 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-scripts\") pod 
\"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384733 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384757 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxnk7\" (UniqueName: \"kubernetes.io/projected/ae539167-f011-4f50-8ce6-df90580fa157-kube-api-access-cxnk7\") pod \"ae539167-f011-4f50-8ce6-df90580fa157\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384779 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-combined-ca-bundle\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384814 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-combined-ca-bundle\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384831 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnxxt\" (UniqueName: \"kubernetes.io/projected/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-kube-api-access-fnxxt\") pod \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384852 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384873 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-config\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384893 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-swift-storage-0\") pod \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\" (UID: \"68efda44-7cf0-44c4-bc50-df0b73ed5b8b\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384939 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpb6b\" (UniqueName: \"kubernetes.io/projected/01ca037a-0388-4fdc-9106-1abbfc17566d-kube-api-access-tpb6b\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384969 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdbserver-nb-tls-certs\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.384996 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-config\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" 
(UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.385026 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6x24\" (UniqueName: \"kubernetes.io/projected/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-kube-api-access-k6x24\") pod \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\" (UID: \"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.385051 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-metrics-certs-tls-certs\") pod \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.385097 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-config\") pod \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\" (UID: \"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.385152 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config\") pod \"ae539167-f011-4f50-8ce6-df90580fa157\" (UID: \"ae539167-f011-4f50-8ce6-df90580fa157\") " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.385992 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" (UID: "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.392196 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.407446 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerName="galera" containerID="cri-o://2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57" gracePeriod=30 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.408781 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.409442 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" (UID: "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.419498 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-scripts" (OuterVolumeSpecName: "scripts") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.428174 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-scripts" (OuterVolumeSpecName: "scripts") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.432138 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.433020 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-config" (OuterVolumeSpecName: "config") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.433872 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-config" (OuterVolumeSpecName: "config") pod "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" (UID: "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.435721 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-config" (OuterVolumeSpecName: "config") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.437354 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae539167-f011-4f50-8ce6-df90580fa157-kube-api-access-cxnk7" (OuterVolumeSpecName: "kube-api-access-cxnk7") pod "ae539167-f011-4f50-8ce6-df90580fa157" (UID: "ae539167-f011-4f50-8ce6-df90580fa157"). InnerVolumeSpecName "kube-api-access-cxnk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.461149 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-kube-api-access-k6x24" (OuterVolumeSpecName: "kube-api-access-k6x24") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "kube-api-access-k6x24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.461249 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ca037a-0388-4fdc-9106-1abbfc17566d-kube-api-access-tpb6b" (OuterVolumeSpecName: "kube-api-access-tpb6b") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "kube-api-access-tpb6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.461306 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.491201 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-kube-api-access-fnxxt" (OuterVolumeSpecName: "kube-api-access-fnxxt") pod "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" (UID: "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d"). InnerVolumeSpecName "kube-api-access-fnxxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.491299 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-kube-api-access-bnz8l" (OuterVolumeSpecName: "kube-api-access-bnz8l") pod "68efda44-7cf0-44c4-bc50-df0b73ed5b8b" (UID: "68efda44-7cf0-44c4-bc50-df0b73ed5b8b"). InnerVolumeSpecName "kube-api-access-bnz8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.503324 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:11:17 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: if [ -n "glance" ]; then Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="glance" Feb 23 07:11:17 crc kubenswrapper[5047]: else Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 07:11:17 crc kubenswrapper[5047]: fi Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 07:11:17 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:11:17 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:11:17 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:11:17 crc kubenswrapper[5047]: # support updates Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.505976 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-86db-account-create-update-dfrnh" podUID="6c6d00f2-ae85-4f68-9645-94e666f0e1c2" Feb 23 07:11:17 crc kubenswrapper[5047]: W0223 07:11:17.510457 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7504baf_d384_4fb8_a053_0da0c933f1e4.slice/crio-a440497af269fcb2c1ad327c0ad418bb3253aba37a62ba7f9c12fc95edeacd5e WatchSource:0}: Error finding container a440497af269fcb2c1ad327c0ad418bb3253aba37a62ba7f9c12fc95edeacd5e: Status 404 returned error can't find the container with id a440497af269fcb2c1ad327c0ad418bb3253aba37a62ba7f9c12fc95edeacd5e Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.510831 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:11:17 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 
07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: if [ -n "cinder" ]; then Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="cinder" Feb 23 07:11:17 crc kubenswrapper[5047]: else Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 07:11:17 crc kubenswrapper[5047]: fi Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 07:11:17 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:11:17 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:11:17 crc kubenswrapper[5047]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:11:17 crc kubenswrapper[5047]: # support updates Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.511964 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-8c0b-account-create-update-fvht5" podUID="fbd8c46d-274a-4544-8d0e-16242f0d9673" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517444 5047 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517491 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517528 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517543 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517558 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517569 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnz8l\" (UniqueName: \"kubernetes.io/projected/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-kube-api-access-bnz8l\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517579 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517588 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517599 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxnk7\" (UniqueName: \"kubernetes.io/projected/ae539167-f011-4f50-8ce6-df90580fa157-kube-api-access-cxnk7\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517610 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnxxt\" (UniqueName: \"kubernetes.io/projected/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-kube-api-access-fnxxt\") on node \"crc\" DevicePath \"\"" Feb 23 
07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517629 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517638 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517647 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpb6b\" (UniqueName: \"kubernetes.io/projected/01ca037a-0388-4fdc-9106-1abbfc17566d-kube-api-access-tpb6b\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.517658 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ca037a-0388-4fdc-9106-1abbfc17566d-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.518290 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6x24\" (UniqueName: \"kubernetes.io/projected/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-kube-api-access-k6x24\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.518333 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.534225 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c0b-account-create-update-fvht5"] Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.537693 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:11:17 crc kubenswrapper[5047]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: if [ -n "neutron" ]; then Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="neutron" Feb 23 07:11:17 crc kubenswrapper[5047]: else Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 07:11:17 crc kubenswrapper[5047]: fi Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 07:11:17 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:11:17 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:11:17 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:11:17 crc kubenswrapper[5047]: # support updates Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.538337 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:11:17 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: if [ -n "placement" ]; then Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="placement" Feb 23 07:11:17 crc kubenswrapper[5047]: else Feb 23 07:11:17 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 07:11:17 crc kubenswrapper[5047]: fi Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 07:11:17 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:11:17 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:11:17 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:11:17 crc kubenswrapper[5047]: # support updates Feb 23 07:11:17 crc kubenswrapper[5047]: Feb 23 07:11:17 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.542845 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-b51d-account-create-update-6tf4f" podUID="8ef3c81d-1f2a-447d-b8c3-687002fa8c35" Feb 23 07:11:17 crc kubenswrapper[5047]: E0223 07:11:17.542894 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-e7d5-account-create-update-89ndz" podUID="b7504baf-d384-4fb8-a053-0da0c933f1e4" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.576052 5047 generic.go:334] "Generic (PLEG): container finished" podID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerID="563e1e2512e494abff485a89912e960e6a601055dece369633c27c745f0ea956" exitCode=0 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.576171 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785b8b86cf-6rvf8" event={"ID":"8b703a8a-7e8f-4565-abc7-86f93a83e742","Type":"ContainerDied","Data":"563e1e2512e494abff485a89912e960e6a601055dece369633c27c745f0ea956"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.579860 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_01ca037a-0388-4fdc-9106-1abbfc17566d/ovsdbserver-nb/0.log" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.580278 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.580454 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"01ca037a-0388-4fdc-9106-1abbfc17566d","Type":"ContainerDied","Data":"cf311530a77d8812faec78173b72066894f73fbdc50a2ec099fdf21241d0732f"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.581289 5047 scope.go:117] "RemoveContainer" containerID="aab582d93c645f1cd1f93f646d13990d4cebf1f69a19788d254697fedd7fc7f7" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.599780 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" (UID: "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.602567 5047 generic.go:334] "Generic (PLEG): container finished" podID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerID="0108825e07a1a995d5a32fde6ef858690c1ed2dd364cbab05781f1acab8f3fcb" exitCode=143 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.602694 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa3349ce-92c0-4fc5-ae0e-6424be7ca179","Type":"ContainerDied","Data":"0108825e07a1a995d5a32fde6ef858690c1ed2dd364cbab05781f1acab8f3fcb"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.608279 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae539167-f011-4f50-8ce6-df90580fa157" (UID: "ae539167-f011-4f50-8ce6-df90580fa157"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.611939 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_850d5f83-0ab4-4d06-8fae-21e0cf9b37c8/ovsdbserver-sb/0.log" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.612117 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"850d5f83-0ab4-4d06-8fae-21e0cf9b37c8","Type":"ContainerDied","Data":"53842cd9436ba4b00e89054c5237078030ef7cd1d36577b1aec59563f9b5b96d"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.612243 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.618370 5047 generic.go:334] "Generic (PLEG): container finished" podID="27668b66-4868-448a-b2dd-e270ed4bc677" containerID="41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854" exitCode=143 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.618563 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cb54ddf68-t6k79" event={"ID":"27668b66-4868-448a-b2dd-e270ed4bc677","Type":"ContainerDied","Data":"41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.621026 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.621052 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.644099 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-e7d5-account-create-update-89ndz"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.644420 5047 generic.go:334] "Generic (PLEG): container finished" podID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerID="9fde3b6829cb3940810ad8ed9c3d9375377b1ab751496ae29f4114c773487dad" exitCode=143 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.644470 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d9eb562-3f84-458f-885a-e2fbb3e86bf3","Type":"ContainerDied","Data":"9fde3b6829cb3940810ad8ed9c3d9375377b1ab751496ae29f4114c773487dad"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.649706 5047 scope.go:117] "RemoveContainer" containerID="dbc428f4bdef3134e02420519caff4df17024718afa37dadd912cd858a13c0cc" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.657698 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c0b-account-create-update-fvht5" event={"ID":"fbd8c46d-274a-4544-8d0e-16242f0d9673","Type":"ContainerStarted","Data":"760e15cebff217b3513a79374e60d2b94b1735120681d693c8851f17a26e5dad"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.667120 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.691411 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.696191 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d867c5cb7-qbvv7"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.697221 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d867c5cb7-qbvv7" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-httpd" containerID="cri-o://370e63e6a062f56f822bb2fff7821e2ee1f32ee82378743cc49b225b72866f01" gracePeriod=30 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.697863 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d867c5cb7-qbvv7" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-server" containerID="cri-o://b5495e9f8841c7dec1c3b19f82d389219f186ba7641ce0bc83c99ddb4383352d" gracePeriod=30 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.715927 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="d513cfd3-cb98-440f-b564-d36d8f20f5a4" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.200:6080/vnc_lite.html\": dial tcp 10.217.0.200:6080: connect: connection refused" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.721483 5047 scope.go:117] "RemoveContainer" containerID="98cb4fcb07ef6bff93f43c15de37a5ab70e29f4de36a5da8359c405261211da7" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.723656 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.723696 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node 
\"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.734603 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="1497b14f7cabd5e74a05e2a73df9cf90fa7bf74823b616c5182cd8dc8c2842e9" exitCode=0 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.734657 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="bd9a6c11cbd3bdfd513be881b8c42a51f7ab2e9b360a02ddb2a5e8383e963bd4" exitCode=0 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.734737 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"1497b14f7cabd5e74a05e2a73df9cf90fa7bf74823b616c5182cd8dc8c2842e9"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.734775 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"bd9a6c11cbd3bdfd513be881b8c42a51f7ab2e9b360a02ddb2a5e8383e963bd4"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.740817 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ae539167-f011-4f50-8ce6-df90580fa157" (UID: "ae539167-f011-4f50-8ce6-df90580fa157"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.772135 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.778050 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerID="8fc7018d470fbeb3c101b59cebfc490bd0f1cc9cb98ca2bbc51f91d35a503470" exitCode=143 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.778117 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb737ca7-c18f-4ff9-9285-bb35ee17cd05","Type":"ContainerDied","Data":"8fc7018d470fbeb3c101b59cebfc490bd0f1cc9cb98ca2bbc51f91d35a503470"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.788571 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bz2gr" event={"ID":"4cdbcdf8-785e-4383-ab02-4492360bf4b4","Type":"ContainerStarted","Data":"a1574901785be60857ec18d629ff9b5cfdcf73694ff90509e4c6b4d31858496b"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.813778 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.814372 5047 scope.go:117] "RemoveContainer" containerID="8656eb8e7f120230bec3b1f18fdcd072d9f82031f44305eb6078bf7e798a30c4" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.823932 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-965e-account-create-update-r6tbz" event={"ID":"6665cd0b-0b9d-4485-8cdb-7abe364508d1","Type":"ContainerStarted","Data":"71f190234e062d39feb415d080c63e22b263ef8e06e0338f19d39b812a121f61"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.840628 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.840681 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.848561 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dqll"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.880837 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-7s2lc" event={"ID":"68efda44-7cf0-44c4-bc50-df0b73ed5b8b","Type":"ContainerDied","Data":"233c74d5d28cbde9dcb36c1578fe5f1cc611f193fe0b34e65d313162444f9a8c"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.881227 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-7s2lc" Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.935082 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-sr2bf"] Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.962190 5047 generic.go:334] "Generic (PLEG): container finished" podID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerID="999e38cfe887aedaec12de40ca31ea006bf118394964b12586bbd7546473dc73" exitCode=0 Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.962371 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4352c518-ada9-4e5e-9327-5bd3c34a2796","Type":"ContainerDied","Data":"999e38cfe887aedaec12de40ca31ea006bf118394964b12586bbd7546473dc73"} Feb 23 07:11:17 crc kubenswrapper[5047]: I0223 07:11:17.967213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b51d-account-create-update-6tf4f" event={"ID":"8ef3c81d-1f2a-447d-b8c3-687002fa8c35","Type":"ContainerStarted","Data":"48047bb1cd69f7598941c8f6dcaa0b3afb4b3cd7144ecd212a2a872d5209229a"} Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.017036 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.025297 5047 generic.go:334] "Generic (PLEG): container finished" podID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerID="aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e" exitCode=143 Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.025541 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" 
event={"ID":"556e6ac3-8c64-4ee2-95f2-511a07bf220b","Type":"ContainerDied","Data":"aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e"} Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.029354 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ac64-account-create-update-zn4xz"] Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.045793 5047 generic.go:334] "Generic (PLEG): container finished" podID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" exitCode=0 Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.045915 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerDied","Data":"62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945"} Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.054104 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.054525 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-86db-account-create-update-dfrnh" event={"ID":"6c6d00f2-ae85-4f68-9645-94e666f0e1c2","Type":"ContainerStarted","Data":"9285d6e794551f7669dc7968cc24020bbb6c7a738ddfaf64623705bff362dae4"} Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.075219 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] 
Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.079611 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="911535c0-45eb-4361-b169-fad54a54d78b" containerName="nova-cell1-conductor-conductor" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.080678 5047 generic.go:334] "Generic (PLEG): container finished" podID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerID="832e9b67b68d0d91fdc72fdc39407b689d44735575143025b03550e2cf607d48" exitCode=143 Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.082013 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4c480a-88f1-42e1-bdce-21bdb85ecc48","Type":"ContainerDied","Data":"832e9b67b68d0d91fdc72fdc39407b689d44735575143025b03550e2cf607d48"} Feb 23 07:11:18 crc kubenswrapper[5047]: W0223 07:11:18.095537 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod969db92a_069e_4713_bca7_5e4ebb89d612.slice/crio-6fdef3452d7a7d7bebad69a84aa8804dbcd2c8588dce9e51c5361c499235c79e WatchSource:0}: Error finding container 6fdef3452d7a7d7bebad69a84aa8804dbcd2c8588dce9e51c5361c499235c79e: Status 404 returned error can't find the container with id 6fdef3452d7a7d7bebad69a84aa8804dbcd2c8588dce9e51c5361c499235c79e Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.102537 5047 generic.go:334] "Generic (PLEG): container finished" podID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerID="4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e" exitCode=143 Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.102697 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-799dddc985-b669w" 
event={"ID":"8af89633-d2bc-4f80-9e1e-0eb183f11462","Type":"ContainerDied","Data":"4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e"} Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.105047 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e7d5-account-create-update-89ndz" event={"ID":"b7504baf-d384-4fb8-a053-0da0c933f1e4","Type":"ContainerStarted","Data":"a440497af269fcb2c1ad327c0ad418bb3253aba37a62ba7f9c12fc95edeacd5e"} Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.141611 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:11:18 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: if [ -n "nova_cell0" ]; then Feb 23 07:11:18 crc kubenswrapper[5047]: GRANT_DATABASE="nova_cell0" Feb 23 07:11:18 crc kubenswrapper[5047]: else Feb 23 07:11:18 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 07:11:18 crc kubenswrapper[5047]: fi Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 07:11:18 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:11:18 crc kubenswrapper[5047]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:11:18 crc kubenswrapper[5047]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:11:18 crc kubenswrapper[5047]: # support updates Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.168508 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" podUID="969db92a-069e-4713-bca7-5e4ebb89d612" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.178321 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hzs78_e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d/openstack-network-exporter/0.log" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.179006 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.182338 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hzs78" event={"ID":"e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d","Type":"ContainerDied","Data":"3103dd1797324eceb063028eb91c70e4411d7a06ccdd9013ce0fb5bbb26c17d7"} Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.183744 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hzs78" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.217163 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:11:18 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: if [ -n "nova_api" ]; then Feb 23 07:11:18 crc kubenswrapper[5047]: GRANT_DATABASE="nova_api" Feb 23 07:11:18 crc kubenswrapper[5047]: else Feb 23 07:11:18 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 07:11:18 crc kubenswrapper[5047]: fi Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 07:11:18 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:11:18 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:11:18 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:11:18 crc kubenswrapper[5047]: # support updates Feb 23 07:11:18 crc kubenswrapper[5047]: Feb 23 07:11:18 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.218337 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-ac64-account-create-update-zn4xz" podUID="82712c2b-a91c-4e0e-9400-0624b0459f57" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.274630 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-config" (OuterVolumeSpecName: "config") pod "68efda44-7cf0-44c4-bc50-df0b73ed5b8b" (UID: "68efda44-7cf0-44c4-bc50-df0b73ed5b8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.279751 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.280781 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68efda44-7cf0-44c4-bc50-df0b73ed5b8b" (UID: "68efda44-7cf0-44c4-bc50-df0b73ed5b8b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.376290 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68efda44-7cf0-44c4-bc50-df0b73ed5b8b" (UID: "68efda44-7cf0-44c4-bc50-df0b73ed5b8b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.387554 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.392298 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.392347 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.394215 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.407339 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5" 
path="/var/lib/kubelet/pods/0dc3bfa6-76a6-45fb-bf1f-7c12b0959ce5/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.408780 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c981b84-6b45-4ba9-be71-22f074a5ccd4" path="/var/lib/kubelet/pods/2c981b84-6b45-4ba9-be71-22f074a5ccd4/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.409479 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4675c789-acd8-480a-a09f-52ab9a827bb6" path="/var/lib/kubelet/pods/4675c789-acd8-480a-a09f-52ab9a827bb6/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.410465 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49095dcf-0259-4bde-a0c1-695841ebd224" path="/var/lib/kubelet/pods/49095dcf-0259-4bde-a0c1-695841ebd224/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.412332 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5409fa-0e21-401f-9b92-09556836eb13" path="/var/lib/kubelet/pods/4d5409fa-0e21-401f-9b92-09556836eb13/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.413115 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e380efd-ee4f-4339-b6ad-5688849a2c93" path="/var/lib/kubelet/pods/7e380efd-ee4f-4339-b6ad-5688849a2c93/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.414170 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e852f86-de0c-4691-b6dc-c0149b5932fb" path="/var/lib/kubelet/pods/7e852f86-de0c-4691-b6dc-c0149b5932fb/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.416177 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a268e45-febe-4ae8-8c29-8f058d3b7e1d" path="/var/lib/kubelet/pods/8a268e45-febe-4ae8-8c29-8f058d3b7e1d/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.417544 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6427b69-b06b-4935-9206-ae09750f8900" 
path="/var/lib/kubelet/pods/b6427b69-b06b-4935-9206-ae09750f8900/volumes" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.432378 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.432490 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" containerName="nova-cell0-conductor-conductor" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.505824 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "68efda44-7cf0-44c4-bc50-df0b73ed5b8b" (UID: "68efda44-7cf0-44c4-bc50-df0b73ed5b8b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.520633 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.552667 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68efda44-7cf0-44c4-bc50-df0b73ed5b8b" (UID: "68efda44-7cf0-44c4-bc50-df0b73ed5b8b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.589636 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" (UID: "e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.602531 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.602569 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.602579 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.602590 5047 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68efda44-7cf0-44c4-bc50-df0b73ed5b8b-dns-swift-storage-0\") on node 
\"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.602665 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.602728 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data podName:6776acf2-e53f-4892-847d-8667669a5eb9 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:22.602708891 +0000 UTC m=+1604.854036025 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data") pod "rabbitmq-server-0" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9") : configmap "rabbitmq-config-data" not found Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.610303 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.653489 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" (UID: "850d5f83-0ab4-4d06-8fae-21e0cf9b37c8"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.663716 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ae539167-f011-4f50-8ce6-df90580fa157" (UID: "ae539167-f011-4f50-8ce6-df90580fa157"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.702105 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.707033 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.723167 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs\") pod \"01ca037a-0388-4fdc-9106-1abbfc17566d\" (UID: \"01ca037a-0388-4fdc-9106-1abbfc17566d\") " Feb 23 07:11:18 crc kubenswrapper[5047]: W0223 07:11:18.723379 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/01ca037a-0388-4fdc-9106-1abbfc17566d/volumes/kubernetes.io~secret/metrics-certs-tls-certs Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.723397 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "01ca037a-0388-4fdc-9106-1abbfc17566d" (UID: "01ca037a-0388-4fdc-9106-1abbfc17566d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.724169 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.724187 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae539167-f011-4f50-8ce6-df90580fa157-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.724198 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.724207 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.724216 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ca037a-0388-4fdc-9106-1abbfc17566d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.834437 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc4s\" (UniqueName: \"kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:18 crc kubenswrapper[5047]: I0223 07:11:18.835183 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts\") pod \"nova-cell1-08b6-account-create-update-hxmz5\" (UID: \"9819ef7e-f619-45a5-b5d3-e67bb5214f24\") " pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.835475 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.835702 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:22.835684361 +0000 UTC m=+1605.087011485 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : configmap "openstack-cell1-scripts" not found Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.845342 5047 projected.go:194] Error preparing data for projected volume kube-api-access-dnc4s for pod openstack/nova-cell1-08b6-account-create-update-hxmz5: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.845457 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s podName:9819ef7e-f619-45a5-b5d3-e67bb5214f24 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:22.845425457 +0000 UTC m=+1605.096752601 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dnc4s" (UniqueName: "kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s") pod "nova-cell1-08b6-account-create-update-hxmz5" (UID: "9819ef7e-f619-45a5-b5d3-e67bb5214f24") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.997514 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57 is running failed: container process not found" containerID="2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.997789 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57 is running failed: container process not found" containerID="2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.998112 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57 is running failed: container process not found" containerID="2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 23 07:11:18 crc kubenswrapper[5047]: E0223 07:11:18.998173 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerName="galera" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.236197 5047 scope.go:117] "RemoveContainer" containerID="458110c59c1b32388e1bbf0559e62ae01cdd38bb1a5dc8a73f30c5ca7efaf50f" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.239864 5047 generic.go:334] "Generic (PLEG): container finished" podID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerID="2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57" exitCode=0 Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.239984 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b66eae6-4565-4f56-8bdc-009aa1101a64","Type":"ContainerDied","Data":"2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.245887 5047 generic.go:334] "Generic (PLEG): container finished" podID="d513cfd3-cb98-440f-b564-d36d8f20f5a4" containerID="106b2bdb20d1a2d2e867ad06c463fa1d14f76f5b815d49afbd6405b8e6d0af57" exitCode=0 Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.246097 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d513cfd3-cb98-440f-b564-d36d8f20f5a4","Type":"ContainerDied","Data":"106b2bdb20d1a2d2e867ad06c463fa1d14f76f5b815d49afbd6405b8e6d0af57"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.246142 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d513cfd3-cb98-440f-b564-d36d8f20f5a4","Type":"ContainerDied","Data":"56e9e173bf16fb369d683be7c134bed29b8b32cbd978952eb87f9751d77dabbd"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.246176 5047 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="56e9e173bf16fb369d683be7c134bed29b8b32cbd978952eb87f9751d77dabbd" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.252230 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" event={"ID":"969db92a-069e-4713-bca7-5e4ebb89d612","Type":"ContainerStarted","Data":"6fdef3452d7a7d7bebad69a84aa8804dbcd2c8588dce9e51c5361c499235c79e"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.259046 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e7d5-account-create-update-89ndz" event={"ID":"b7504baf-d384-4fb8-a053-0da0c933f1e4","Type":"ContainerDied","Data":"a440497af269fcb2c1ad327c0ad418bb3253aba37a62ba7f9c12fc95edeacd5e"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.259103 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a440497af269fcb2c1ad327c0ad418bb3253aba37a62ba7f9c12fc95edeacd5e" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.268019 5047 generic.go:334] "Generic (PLEG): container finished" podID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerID="b1156e1b07cb20a8e9857fce83fae53cb9cfb48c367023bcb30470fba1c4f122" exitCode=0 Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.268153 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4352c518-ada9-4e5e-9327-5bd3c34a2796","Type":"ContainerDied","Data":"b1156e1b07cb20a8e9857fce83fae53cb9cfb48c367023bcb30470fba1c4f122"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.287346 5047 generic.go:334] "Generic (PLEG): container finished" podID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerID="f57fe1717429a55ae45cdfc5deaf8b990a03382780211bed789b9fa6228db106" exitCode=1 Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.287965 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bz2gr" 
event={"ID":"4cdbcdf8-785e-4383-ab02-4492360bf4b4","Type":"ContainerDied","Data":"f57fe1717429a55ae45cdfc5deaf8b990a03382780211bed789b9fa6228db106"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.289125 5047 scope.go:117] "RemoveContainer" containerID="f57fe1717429a55ae45cdfc5deaf8b990a03382780211bed789b9fa6228db106" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.298207 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-86db-account-create-update-dfrnh" event={"ID":"6c6d00f2-ae85-4f68-9645-94e666f0e1c2","Type":"ContainerDied","Data":"9285d6e794551f7669dc7968cc24020bbb6c7a738ddfaf64623705bff362dae4"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.298263 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9285d6e794551f7669dc7968cc24020bbb6c7a738ddfaf64623705bff362dae4" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.309681 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-5f4db5cb66-tjpmr" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.167:8778/\": read tcp 10.217.0.2:39796->10.217.0.167:8778: read: connection reset by peer" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.310345 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-5f4db5cb66-tjpmr" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.167:8778/\": read tcp 10.217.0.2:39794->10.217.0.167:8778: read: connection reset by peer" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.334537 5047 generic.go:334] "Generic (PLEG): container finished" podID="911535c0-45eb-4361-b169-fad54a54d78b" containerID="fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6" exitCode=0 Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.334663 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"911535c0-45eb-4361-b169-fad54a54d78b","Type":"ContainerDied","Data":"fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.346688 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac64-account-create-update-zn4xz" event={"ID":"82712c2b-a91c-4e0e-9400-0624b0459f57","Type":"ContainerStarted","Data":"5514b4db54daaed4a675e4dd17a6506ddcc223c2ff34bb30243b0a73453d4636"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.352561 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c0b-account-create-update-fvht5" event={"ID":"fbd8c46d-274a-4544-8d0e-16242f0d9673","Type":"ContainerDied","Data":"760e15cebff217b3513a79374e60d2b94b1735120681d693c8851f17a26e5dad"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.352591 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760e15cebff217b3513a79374e60d2b94b1735120681d693c8851f17a26e5dad" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.354611 5047 generic.go:334] "Generic (PLEG): container finished" podID="f29958fb-3a36-427e-8094-62f7522b7a17" containerID="b5495e9f8841c7dec1c3b19f82d389219f186ba7641ce0bc83c99ddb4383352d" exitCode=0 Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.354648 5047 generic.go:334] "Generic (PLEG): container finished" podID="f29958fb-3a36-427e-8094-62f7522b7a17" containerID="370e63e6a062f56f822bb2fff7821e2ee1f32ee82378743cc49b225b72866f01" exitCode=0 Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.354677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d867c5cb7-qbvv7" event={"ID":"f29958fb-3a36-427e-8094-62f7522b7a17","Type":"ContainerDied","Data":"b5495e9f8841c7dec1c3b19f82d389219f186ba7641ce0bc83c99ddb4383352d"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.354692 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-proxy-d867c5cb7-qbvv7" event={"ID":"f29958fb-3a36-427e-8094-62f7522b7a17","Type":"ContainerDied","Data":"370e63e6a062f56f822bb2fff7821e2ee1f32ee82378743cc49b225b72866f01"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.355758 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dqll" event={"ID":"8fc70af3-2572-4019-adcb-8aecf538ae27","Type":"ContainerStarted","Data":"b4540fa9d05dda117642801ed880bc527e2c94ba7d53cadaed2d9d9d9b2022b4"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.372159 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b51d-account-create-update-6tf4f" event={"ID":"8ef3c81d-1f2a-447d-b8c3-687002fa8c35","Type":"ContainerDied","Data":"48047bb1cd69f7598941c8f6dcaa0b3afb4b3cd7144ecd212a2a872d5209229a"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.372219 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48047bb1cd69f7598941c8f6dcaa0b3afb4b3cd7144ecd212a2a872d5209229a" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.394443 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-965e-account-create-update-r6tbz" event={"ID":"6665cd0b-0b9d-4485-8cdb-7abe364508d1","Type":"ContainerDied","Data":"71f190234e062d39feb415d080c63e22b263ef8e06e0338f19d39b812a121f61"} Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.394502 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f190234e062d39feb415d080c63e22b263ef8e06e0338f19d39b812a121f61" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.658807 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-hxmz5" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.715662 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-fvht5" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.782322 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.782766 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b51d-account-create-update-6tf4f" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.782863 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-7s2lc"] Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.789069 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7677694455-7s2lc"] Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.806982 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.809661 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxrlq\" (UniqueName: \"kubernetes.io/projected/fbd8c46d-274a-4544-8d0e-16242f0d9673-kube-api-access-zxrlq\") pod \"fbd8c46d-274a-4544-8d0e-16242f0d9673\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.809701 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-combined-ca-bundle\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.809925 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" 
(UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.809979 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd8c46d-274a-4544-8d0e-16242f0d9673-operator-scripts\") pod \"fbd8c46d-274a-4544-8d0e-16242f0d9673\" (UID: \"fbd8c46d-274a-4544-8d0e-16242f0d9673\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.809996 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.810067 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dktq\" (UniqueName: \"kubernetes.io/projected/d513cfd3-cb98-440f-b564-d36d8f20f5a4-kube-api-access-7dktq\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.810094 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-operator-scripts\") pod \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\" (UID: \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.810109 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn7df\" (UniqueName: \"kubernetes.io/projected/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-kube-api-access-nn7df\") pod \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\" (UID: \"8ef3c81d-1f2a-447d-b8c3-687002fa8c35\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.810130 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") " Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.817831 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": read tcp 10.217.0.2:50208->10.217.0.163:8776: read: connection reset by peer" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.819994 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd8c46d-274a-4544-8d0e-16242f0d9673-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbd8c46d-274a-4544-8d0e-16242f0d9673" (UID: "fbd8c46d-274a-4544-8d0e-16242f0d9673"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.824234 5047 scope.go:117] "RemoveContainer" containerID="7fb546caaa8981682e3f4a1e627c0bc5a97ab373a74aa0917242279733ff16a3" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.824700 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-86db-account-create-update-dfrnh" Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.825755 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ef3c81d-1f2a-447d-b8c3-687002fa8c35" (UID: "8ef3c81d-1f2a-447d-b8c3-687002fa8c35"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.828429 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.829637 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-965e-account-create-update-r6tbz"
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.845539 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 23 07:11:19 crc kubenswrapper[5047]: E0223 07:11:19.854489 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f is running failed: container process not found" containerID="673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:11:19 crc kubenswrapper[5047]: E0223 07:11:19.859825 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f is running failed: container process not found" containerID="673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:11:19 crc kubenswrapper[5047]: E0223 07:11:19.861239 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f is running failed: container process not found" containerID="673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:11:19 crc kubenswrapper[5047]: E0223 07:11:19.861316 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5c628150-3bde-4740-9c94-dc208f61ade2" containerName="nova-scheduler-scheduler"
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.866555 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.868176 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d513cfd3-cb98-440f-b564-d36d8f20f5a4-kube-api-access-7dktq" (OuterVolumeSpecName: "kube-api-access-7dktq") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "kube-api-access-7dktq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.868820 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd8c46d-274a-4544-8d0e-16242f0d9673-kube-api-access-zxrlq" (OuterVolumeSpecName: "kube-api-access-zxrlq") pod "fbd8c46d-274a-4544-8d0e-16242f0d9673" (UID: "fbd8c46d-274a-4544-8d0e-16242f0d9673"). InnerVolumeSpecName "kube-api-access-zxrlq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.869387 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-kube-api-access-nn7df" (OuterVolumeSpecName: "kube-api-access-nn7df") pod "8ef3c81d-1f2a-447d-b8c3-687002fa8c35" (UID: "8ef3c81d-1f2a-447d-b8c3-687002fa8c35"). InnerVolumeSpecName "kube-api-access-nn7df". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.900594 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-hzs78"]
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.904142 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.179:9292/healthcheck\": read tcp 10.217.0.2:52718->10.217.0.179:9292: read: connection reset by peer"
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.904413 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.179:9292/healthcheck\": read tcp 10.217.0.2:52714->10.217.0.179:9292: read: connection reset by peer"
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.906560 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-hzs78"]
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.919558 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-89ndz"
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.924133 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxrlq\" (UniqueName: \"kubernetes.io/projected/fbd8c46d-274a-4544-8d0e-16242f0d9673-kube-api-access-zxrlq\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.924193 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbd8c46d-274a-4544-8d0e-16242f0d9673-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.924206 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dktq\" (UniqueName: \"kubernetes.io/projected/d513cfd3-cb98-440f-b564-d36d8f20f5a4-kube-api-access-7dktq\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.924218 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.924228 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn7df\" (UniqueName: \"kubernetes.io/projected/8ef3c81d-1f2a-447d-b8c3-687002fa8c35-kube-api-access-nn7df\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:19 crc kubenswrapper[5047]: I0223 07:11:19.985081 5047 scope.go:117] "RemoveContainer" containerID="df677e26a5cb88dc86b06af292209bbf0272870d4b53c39276d9422ebee88afe"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.032273 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6665cd0b-0b9d-4485-8cdb-7abe364508d1-operator-scripts\") pod \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.032316 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgg69\" (UniqueName: \"kubernetes.io/projected/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-kube-api-access-sgg69\") pod \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.032377 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7504baf-d384-4fb8-a053-0da0c933f1e4-operator-scripts\") pod \"b7504baf-d384-4fb8-a053-0da0c933f1e4\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.032453 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vltlc\" (UniqueName: \"kubernetes.io/projected/6665cd0b-0b9d-4485-8cdb-7abe364508d1-kube-api-access-vltlc\") pod \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\" (UID: \"6665cd0b-0b9d-4485-8cdb-7abe364508d1\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.032568 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-operator-scripts\") pod \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\" (UID: \"6c6d00f2-ae85-4f68-9645-94e666f0e1c2\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.032591 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kjzz\" (UniqueName: \"kubernetes.io/projected/b7504baf-d384-4fb8-a053-0da0c933f1e4-kube-api-access-5kjzz\") pod \"b7504baf-d384-4fb8-a053-0da0c933f1e4\" (UID: \"b7504baf-d384-4fb8-a053-0da0c933f1e4\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.047957 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6665cd0b-0b9d-4485-8cdb-7abe364508d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6665cd0b-0b9d-4485-8cdb-7abe364508d1" (UID: "6665cd0b-0b9d-4485-8cdb-7abe364508d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.048379 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c6d00f2-ae85-4f68-9645-94e666f0e1c2" (UID: "6c6d00f2-ae85-4f68-9645-94e666f0e1c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.053000 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7504baf-d384-4fb8-a053-0da0c933f1e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7504baf-d384-4fb8-a053-0da0c933f1e4" (UID: "b7504baf-d384-4fb8-a053-0da0c933f1e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.071437 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6665cd0b-0b9d-4485-8cdb-7abe364508d1-kube-api-access-vltlc" (OuterVolumeSpecName: "kube-api-access-vltlc") pod "6665cd0b-0b9d-4485-8cdb-7abe364508d1" (UID: "6665cd0b-0b9d-4485-8cdb-7abe364508d1"). InnerVolumeSpecName "kube-api-access-vltlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.071964 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-kube-api-access-sgg69" (OuterVolumeSpecName: "kube-api-access-sgg69") pod "6c6d00f2-ae85-4f68-9645-94e666f0e1c2" (UID: "6c6d00f2-ae85-4f68-9645-94e666f0e1c2"). InnerVolumeSpecName "kube-api-access-sgg69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.073618 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7504baf-d384-4fb8-a053-0da0c933f1e4-kube-api-access-5kjzz" (OuterVolumeSpecName: "kube-api-access-5kjzz") pod "b7504baf-d384-4fb8-a053-0da0c933f1e4" (UID: "b7504baf-d384-4fb8-a053-0da0c933f1e4"). InnerVolumeSpecName "kube-api-access-5kjzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.112997 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:46880->10.217.0.207:8775: read: connection reset by peer"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.113369 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:46884->10.217.0.207:8775: read: connection reset by peer"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.133126 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.151561 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data" (OuterVolumeSpecName: "config-data") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.160209 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.160275 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.169439 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.169555 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.169738 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs\") pod \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\" (UID: \"d513cfd3-cb98-440f-b564-d36d8f20f5a4\") "
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.170861 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgg69\" (UniqueName: \"kubernetes.io/projected/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-kube-api-access-sgg69\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.170881 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6665cd0b-0b9d-4485-8cdb-7abe364508d1-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.170894 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: W0223 07:11:20.172890 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d513cfd3-cb98-440f-b564-d36d8f20f5a4/volumes/kubernetes.io~secret/nova-novncproxy-tls-certs
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.172959 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: W0223 07:11:20.172973 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d513cfd3-cb98-440f-b564-d36d8f20f5a4/volumes/kubernetes.io~secret/vencrypt-tls-certs
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.173007 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: W0223 07:11:20.173014 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d513cfd3-cb98-440f-b564-d36d8f20f5a4/volumes/kubernetes.io~secret/config-data
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.173060 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data" (OuterVolumeSpecName: "config-data") pod "d513cfd3-cb98-440f-b564-d36d8f20f5a4" (UID: "d513cfd3-cb98-440f-b564-d36d8f20f5a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.170903 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7504baf-d384-4fb8-a053-0da0c933f1e4-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.174049 5047 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.174061 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vltlc\" (UniqueName: \"kubernetes.io/projected/6665cd0b-0b9d-4485-8cdb-7abe364508d1-kube-api-access-vltlc\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.174073 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.174084 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c6d00f2-ae85-4f68-9645-94e666f0e1c2-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.174095 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kjzz\" (UniqueName: \"kubernetes.io/projected/b7504baf-d384-4fb8-a053-0da0c933f1e4-kube-api-access-5kjzz\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.174105 5047 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d513cfd3-cb98-440f-b564-d36d8f20f5a4-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.191324 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fk6gc" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" probeResult="failure" output=<
Feb 23 07:11:20 crc kubenswrapper[5047]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0
Feb 23 07:11:20 crc kubenswrapper[5047]: >
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.202172 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.202755 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.203209 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.203271 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server"
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.203374 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.207111 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.208660 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.208706 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.363569 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" path="/var/lib/kubelet/pods/01ca037a-0388-4fdc-9106-1abbfc17566d/volumes"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.364858 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" path="/var/lib/kubelet/pods/68efda44-7cf0-44c4-bc50-df0b73ed5b8b/volumes"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.365537 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" path="/var/lib/kubelet/pods/850d5f83-0ab4-4d06-8fae-21e0cf9b37c8/volumes"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.367173 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae539167-f011-4f50-8ce6-df90580fa157" path="/var/lib/kubelet/pods/ae539167-f011-4f50-8ce6-df90580fa157/volumes"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.367714 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" path="/var/lib/kubelet/pods/e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d/volumes"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.434065 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cb54ddf68-t6k79" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:37454->10.217.0.164:9311: read: connection reset by peer"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.439198 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cb54ddf68-t6k79" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:37448->10.217.0.164:9311: read: connection reset by peer"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.448268 5047 generic.go:334] "Generic (PLEG): container finished" podID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerID="0d84f1852bd6b7dbc4d9bdf4ea958827422207ab1118be596167434c3cd34775" exitCode=1
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.448336 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bz2gr" event={"ID":"4cdbcdf8-785e-4383-ab02-4492360bf4b4","Type":"ContainerDied","Data":"0d84f1852bd6b7dbc4d9bdf4ea958827422207ab1118be596167434c3cd34775"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.449036 5047 scope.go:117] "RemoveContainer" containerID="0d84f1852bd6b7dbc4d9bdf4ea958827422207ab1118be596167434c3cd34775"
Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.449326 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-bz2gr_openstack(4cdbcdf8-785e-4383-ab02-4492360bf4b4)\"" pod="openstack/root-account-create-update-bz2gr" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.454432 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4352c518-ada9-4e5e-9327-5bd3c34a2796","Type":"ContainerDied","Data":"34bba24d9435900c4d3d508105adeca7395482f32084702d9e129a01e170eb4d"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.454490 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34bba24d9435900c4d3d508105adeca7395482f32084702d9e129a01e170eb4d"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.465456 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" event={"ID":"969db92a-069e-4713-bca7-5e4ebb89d612","Type":"ContainerDied","Data":"6fdef3452d7a7d7bebad69a84aa8804dbcd2c8588dce9e51c5361c499235c79e"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.465527 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fdef3452d7a7d7bebad69a84aa8804dbcd2c8588dce9e51c5361c499235c79e"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.471784 5047 generic.go:334] "Generic (PLEG): container finished" podID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerID="9e039263a53bbeb35ac12ba86211b210fd996b0f94ebba2771ea8c71cf94c08e" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.471995 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4c480a-88f1-42e1-bdce-21bdb85ecc48","Type":"ContainerDied","Data":"9e039263a53bbeb35ac12ba86211b210fd996b0f94ebba2771ea8c71cf94c08e"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.480486 5047 generic.go:334] "Generic (PLEG): container finished" podID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerID="1bff42b06a8741633963197012e9bcd6401b4df5472444c488f4134bb829bd6d" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.480651 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f4db5cb66-tjpmr" event={"ID":"a452777f-60a7-4cfc-9e9b-b262be0a27cf","Type":"ContainerDied","Data":"1bff42b06a8741633963197012e9bcd6401b4df5472444c488f4134bb829bd6d"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.480714 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f4db5cb66-tjpmr" event={"ID":"a452777f-60a7-4cfc-9e9b-b262be0a27cf","Type":"ContainerDied","Data":"a48ca9edf9fb0ab543fac6566b706c45ff5504533f7079893ebae06fdb015716"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.480729 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48ca9edf9fb0ab543fac6566b706c45ff5504533f7079893ebae06fdb015716"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.485270 5047 generic.go:334] "Generic (PLEG): container finished" podID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerID="57928662f0ede1894731138f7257539c72b6683c8af427121b07e24b7dd35d1f" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.485340 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa3349ce-92c0-4fc5-ae0e-6424be7ca179","Type":"ContainerDied","Data":"57928662f0ede1894731138f7257539c72b6683c8af427121b07e24b7dd35d1f"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.490180 5047 generic.go:334] "Generic (PLEG): container finished" podID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerID="182404da8e3655734af75c4e164a4fa1026d1387cba22058cc3f4e7a9156ac6b" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.490299 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d9eb562-3f84-458f-885a-e2fbb3e86bf3","Type":"ContainerDied","Data":"182404da8e3655734af75c4e164a4fa1026d1387cba22058cc3f4e7a9156ac6b"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.499281 5047 generic.go:334] "Generic (PLEG): container finished" podID="69d03df4-c334-4c64-a273-e4e307df5add" containerID="92383f49d49f2ba213699c653ceecca29ba9aba4cc99635c1909b4aa5789dcd9" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.499381 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69d03df4-c334-4c64-a273-e4e307df5add","Type":"ContainerDied","Data":"92383f49d49f2ba213699c653ceecca29ba9aba4cc99635c1909b4aa5789dcd9"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.501004 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac64-account-create-update-zn4xz" event={"ID":"82712c2b-a91c-4e0e-9400-0624b0459f57","Type":"ContainerDied","Data":"5514b4db54daaed4a675e4dd17a6506ddcc223c2ff34bb30243b0a73453d4636"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.501075 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5514b4db54daaed4a675e4dd17a6506ddcc223c2ff34bb30243b0a73453d4636"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.502985 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"911535c0-45eb-4361-b169-fad54a54d78b","Type":"ContainerDied","Data":"88888b246d29f927a3c12dc0e8aa82e6bcb34276355e5324bd30894dc59ad8af"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.503017 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88888b246d29f927a3c12dc0e8aa82e6bcb34276355e5324bd30894dc59ad8af"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.514973 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9b66eae6-4565-4f56-8bdc-009aa1101a64","Type":"ContainerDied","Data":"560cfda7a2ac6dcc6ad6cf9845a31396ed7f647407dc569b698b26c35498c194"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.515035 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="560cfda7a2ac6dcc6ad6cf9845a31396ed7f647407dc569b698b26c35498c194"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.520093 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerID="28fb45634504bebe907f3061a04e53baf4e24bef8b044220704d400b3404f735" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.520152 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb737ca7-c18f-4ff9-9285-bb35ee17cd05","Type":"ContainerDied","Data":"28fb45634504bebe907f3061a04e53baf4e24bef8b044220704d400b3404f735"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.526585 5047 generic.go:334] "Generic (PLEG): container finished" podID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerID="7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.526707 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dqll" event={"ID":"8fc70af3-2572-4019-adcb-8aecf538ae27","Type":"ContainerDied","Data":"7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.531054 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d867c5cb7-qbvv7" event={"ID":"f29958fb-3a36-427e-8094-62f7522b7a17","Type":"ContainerDied","Data":"50597a9835253b8b1600f3f45196fa05180d5e0f4546ef14277aebfeb1eff068"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.531115 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50597a9835253b8b1600f3f45196fa05180d5e0f4546ef14277aebfeb1eff068"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.533303 5047 generic.go:334] "Generic (PLEG): container finished" podID="5c628150-3bde-4740-9c94-dc208f61ade2" containerID="673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f" exitCode=0
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.533361 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c628150-3bde-4740-9c94-dc208f61ade2","Type":"ContainerDied","Data":"673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.533407 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c628150-3bde-4740-9c94-dc208f61ade2","Type":"ContainerDied","Data":"3da926f76ee4233efa2ff012807dcadbff80d3b70dd92f81f859289ec06ac986"}
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.533418 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-965e-account-create-update-r6tbz"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.533422 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3da926f76ee4233efa2ff012807dcadbff80d3b70dd92f81f859289ec06ac986"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.533485 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c0b-account-create-update-fvht5"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.536502 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e7d5-account-create-update-89ndz"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.536653 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.536836 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-08b6-account-create-update-hxmz5"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.536982 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b51d-account-create-update-6tf4f"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.537748 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-86db-account-create-update-dfrnh"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.710733 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d867c5cb7-qbvv7"
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.714244 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.725566 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-central-agent" containerID="cri-o://ac42b72f2ef3e70086002c27d0507227c425dcef23dd72ee4b41f90f506d1f7e" gracePeriod=30
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.725797 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="proxy-httpd" containerID="cri-o://1843426c8ae30cc809f15219c4a71a415a94a52a2bc0a9a35f7e3c5e3ca3cd62" gracePeriod=30
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.725846 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="sg-core" containerID="cri-o://de1bae2fc8abe4518e053551089b717523cf45b88661909a16503dc7aee58baf" gracePeriod=30
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.725888 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-notification-agent" containerID="cri-o://235052c048504a3689eeba925718b5995ae8bb50fcdd7c36cd0a37a3cac313c6" gracePeriod=30
Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.742552 5047 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.743388 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.743675 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="95d08092-3eac-4289-b594-a77d5dfecfe9" containerName="kube-state-metrics" containerID="cri-o://897b14a78dfbcdad54f56571d5c5ba61205fe2678162c41bb69a05cb9c0da5b7" gracePeriod=30 Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.749708 5047 scope.go:117] "RemoveContainer" containerID="13fb30f0e98afd5102c75b063d5e9dedad3a88eb8a1cb4f9dfb35321bd545830" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.806099 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.809568 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 07:11:20 crc kubenswrapper[5047]: E0223 07:11:20.828568 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.892259 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-86db-account-create-update-dfrnh"] Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.913740 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958591 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-config-data\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958671 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts\") pod \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958722 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n4v2\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958761 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-combined-ca-bundle\") pod \"911535c0-45eb-4361-b169-fad54a54d78b\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958789 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-internal-tls-certs\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958819 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-kolla-config\") pod \"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958847 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-combined-ca-bundle\") pod \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958889 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969db92a-069e-4713-bca7-5e4ebb89d612-operator-scripts\") pod \"969db92a-069e-4713-bca7-5e4ebb89d612\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958935 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-galera-tls-certs\") pod \"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.958971 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-internal-tls-certs\") pod \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959026 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-config-data\") pod \"911535c0-45eb-4361-b169-fad54a54d78b\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " Feb 23 07:11:20 crc 
kubenswrapper[5047]: I0223 07:11:20.959089 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a452777f-60a7-4cfc-9e9b-b262be0a27cf-logs\") pod \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959134 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-generated\") pod \"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959161 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfn69\" (UniqueName: \"kubernetes.io/projected/9b66eae6-4565-4f56-8bdc-009aa1101a64-kube-api-access-pfn69\") pod \"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959207 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-public-tls-certs\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959229 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-default\") pod \"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959295 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-combined-ca-bundle\") pod \"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959327 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmtwx\" (UniqueName: \"kubernetes.io/projected/911535c0-45eb-4361-b169-fad54a54d78b-kube-api-access-kmtwx\") pod \"911535c0-45eb-4361-b169-fad54a54d78b\" (UID: \"911535c0-45eb-4361-b169-fad54a54d78b\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959363 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwsp5\" (UniqueName: \"kubernetes.io/projected/969db92a-069e-4713-bca7-5e4ebb89d612-kube-api-access-pwsp5\") pod \"969db92a-069e-4713-bca7-5e4ebb89d612\" (UID: \"969db92a-069e-4713-bca7-5e4ebb89d612\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959410 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959429 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-run-httpd\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959462 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data\") pod \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 
07:11:20.959540 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-combined-ca-bundle\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959576 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959600 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-public-tls-certs\") pod \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959636 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k82bh\" (UniqueName: \"kubernetes.io/projected/a452777f-60a7-4cfc-9e9b-b262be0a27cf-kube-api-access-k82bh\") pod \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\" (UID: \"a452777f-60a7-4cfc-9e9b-b262be0a27cf\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959684 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-log-httpd\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.959706 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-operator-scripts\") pod 
\"9b66eae6-4565-4f56-8bdc-009aa1101a64\" (UID: \"9b66eae6-4565-4f56-8bdc-009aa1101a64\") " Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.961267 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.963066 5047 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.963912 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.964516 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a452777f-60a7-4cfc-9e9b-b262be0a27cf-logs" (OuterVolumeSpecName: "logs") pod "a452777f-60a7-4cfc-9e9b-b262be0a27cf" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.964756 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969db92a-069e-4713-bca7-5e4ebb89d612-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "969db92a-069e-4713-bca7-5e4ebb89d612" (UID: "969db92a-069e-4713-bca7-5e4ebb89d612"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:20 crc kubenswrapper[5047]: I0223 07:11:20.971129 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.006773 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.022598 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-86db-account-create-update-dfrnh"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.022621 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts" (OuterVolumeSpecName: "scripts") pod "a452777f-60a7-4cfc-9e9b-b262be0a27cf" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.050445 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.053725 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.057966 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.058140 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.063856 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a452777f-60a7-4cfc-9e9b-b262be0a27cf-kube-api-access-k82bh" (OuterVolumeSpecName: "kube-api-access-k82bh") pod "a452777f-60a7-4cfc-9e9b-b262be0a27cf" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf"). InnerVolumeSpecName "kube-api-access-k82bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.065962 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.066049 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="ovn-northd" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.066522 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.068482 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.070417 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2" (OuterVolumeSpecName: "kube-api-access-2n4v2") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "kube-api-access-2n4v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.077154 5047 scope.go:117] "RemoveContainer" containerID="f57fe1717429a55ae45cdfc5deaf8b990a03382780211bed789b9fa6228db106" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.077738 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-combined-ca-bundle\") pod \"4352c518-ada9-4e5e-9327-5bd3c34a2796\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.078032 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-scripts\") pod \"4352c518-ada9-4e5e-9327-5bd3c34a2796\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.078274 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.078347 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4352c518-ada9-4e5e-9327-5bd3c34a2796-etc-machine-id\") pod \"4352c518-ada9-4e5e-9327-5bd3c34a2796\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.078523 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-478lp\" (UniqueName: \"kubernetes.io/projected/4352c518-ada9-4e5e-9327-5bd3c34a2796-kube-api-access-478lp\") pod \"4352c518-ada9-4e5e-9327-5bd3c34a2796\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " 
Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.078591 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n4v2\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2\") pod \"f29958fb-3a36-427e-8094-62f7522b7a17\" (UID: \"f29958fb-3a36-427e-8094-62f7522b7a17\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.078650 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data-custom\") pod \"4352c518-ada9-4e5e-9327-5bd3c34a2796\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.078745 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data\") pod \"4352c518-ada9-4e5e-9327-5bd3c34a2796\" (UID: \"4352c518-ada9-4e5e-9327-5bd3c34a2796\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.079249 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911535c0-45eb-4361-b169-fad54a54d78b-kube-api-access-kmtwx" (OuterVolumeSpecName: "kube-api-access-kmtwx") pod "911535c0-45eb-4361-b169-fad54a54d78b" (UID: "911535c0-45eb-4361-b169-fad54a54d78b"). InnerVolumeSpecName "kube-api-access-kmtwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: W0223 07:11:21.079476 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f29958fb-3a36-427e-8094-62f7522b7a17/volumes/kubernetes.io~projected/kube-api-access-2n4v2 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.079573 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2" (OuterVolumeSpecName: "kube-api-access-2n4v2") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "kube-api-access-2n4v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: W0223 07:11:21.079919 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f29958fb-3a36-427e-8094-62f7522b7a17/volumes/kubernetes.io~projected/etc-swift Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.079949 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.079974 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4352c518-ada9-4e5e-9327-5bd3c34a2796-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4352c518-ada9-4e5e-9327-5bd3c34a2796" (UID: "4352c518-ada9-4e5e-9327-5bd3c34a2796"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080667 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k82bh\" (UniqueName: \"kubernetes.io/projected/a452777f-60a7-4cfc-9e9b-b262be0a27cf-kube-api-access-k82bh\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080701 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080724 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080747 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080760 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n4v2\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-kube-api-access-2n4v2\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080779 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969db92a-069e-4713-bca7-5e4ebb89d612-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080794 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a452777f-60a7-4cfc-9e9b-b262be0a27cf-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080810 5047 
reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080824 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b66eae6-4565-4f56-8bdc-009aa1101a64-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080845 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmtwx\" (UniqueName: \"kubernetes.io/projected/911535c0-45eb-4361-b169-fad54a54d78b-kube-api-access-kmtwx\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080871 5047 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f29958fb-3a36-427e-8094-62f7522b7a17-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080886 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f29958fb-3a36-427e-8094-62f7522b7a17-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.080901 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4352c518-ada9-4e5e-9327-5bd3c34a2796-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.081322 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b66eae6-4565-4f56-8bdc-009aa1101a64-kube-api-access-pfn69" (OuterVolumeSpecName: "kube-api-access-pfn69") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "kube-api-access-pfn69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.084092 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8b47-account-create-update-btdp7"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.102738 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4352c518-ada9-4e5e-9327-5bd3c34a2796" (UID: "4352c518-ada9-4e5e-9327-5bd3c34a2796"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.103797 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969db92a-069e-4713-bca7-5e4ebb89d612-kube-api-access-pwsp5" (OuterVolumeSpecName: "kube-api-access-pwsp5") pod "969db92a-069e-4713-bca7-5e4ebb89d612" (UID: "969db92a-069e-4713-bca7-5e4ebb89d612"). InnerVolumeSpecName "kube-api-access-pwsp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.104853 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4352c518-ada9-4e5e-9327-5bd3c34a2796-kube-api-access-478lp" (OuterVolumeSpecName: "kube-api-access-478lp") pod "4352c518-ada9-4e5e-9327-5bd3c34a2796" (UID: "4352c518-ada9-4e5e-9327-5bd3c34a2796"). InnerVolumeSpecName "kube-api-access-478lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.109750 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-scripts" (OuterVolumeSpecName: "scripts") pod "4352c518-ada9-4e5e-9327-5bd3c34a2796" (UID: "4352c518-ada9-4e5e-9327-5bd3c34a2796"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.121071 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.121732 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="4c814692-df9e-470f-8aad-364d48f82b81" containerName="memcached" containerID="cri-o://bcb9e71f1342662d147a950f9bce3acb70d72672c9ecd0106c2560c4f8b80214" gracePeriod=30 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.127513 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8b47-account-create-update-btdp7"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.134852 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.137353 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8b47-account-create-update-5zh78"] Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138273 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-log" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138287 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-log" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138300 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138306 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138318 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="probe" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138326 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="probe" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138339 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-api" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138345 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-api" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138373 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" 
containerName="ovsdbserver-nb" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138379 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerName="ovsdbserver-nb" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138389 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138395 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138406 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-httpd" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138412 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-httpd" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138427 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d513cfd3-cb98-440f-b564-d36d8f20f5a4" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138433 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d513cfd3-cb98-440f-b564-d36d8f20f5a4" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138447 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerName="init" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138459 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerName="init" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138470 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerName="mysql-bootstrap" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138478 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerName="mysql-bootstrap" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138505 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="cinder-scheduler" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138510 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="cinder-scheduler" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138523 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerName="galera" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138528 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerName="galera" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138536 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138541 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138551 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="ovsdbserver-sb" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138558 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="ovsdbserver-sb" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138568 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="911535c0-45eb-4361-b169-fad54a54d78b" containerName="nova-cell1-conductor-conductor" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138576 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="911535c0-45eb-4361-b169-fad54a54d78b" containerName="nova-cell1-conductor-conductor" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138588 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-server" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138593 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-server" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.138606 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerName="dnsmasq-dns" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138611 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerName="dnsmasq-dns" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138781 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="ovsdbserver-sb" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138799 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="cinder-scheduler" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138807 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-httpd" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138815 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="850d5f83-0ab4-4d06-8fae-21e0cf9b37c8" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138823 5047 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-log" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138832 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerName="ovsdbserver-nb" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138840 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="911535c0-45eb-4361-b169-fad54a54d78b" containerName="nova-cell1-conductor-conductor" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138849 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" containerName="galera" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138861 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f71e4e-4c37-45a9-ab2a-b9197e9b5f7d" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138872 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" containerName="probe" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138881 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" containerName="proxy-server" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138889 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d513cfd3-cb98-440f-b564-d36d8f20f5a4" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138899 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ca037a-0388-4fdc-9106-1abbfc17566d" containerName="openstack-network-exporter" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.138920 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" containerName="placement-api" Feb 23 07:11:21 crc 
kubenswrapper[5047]: I0223 07:11:21.138927 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="68efda44-7cf0-44c4-bc50-df0b73ed5b8b" containerName="dnsmasq-dns" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.141481 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.147090 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.161012 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8b47-account-create-update-5zh78"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.171438 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lc9cb"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.182131 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lc9cb"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.183053 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6gjd\" (UniqueName: \"kubernetes.io/projected/82712c2b-a91c-4e0e-9400-0624b0459f57-kube-api-access-p6gjd\") pod \"82712c2b-a91c-4e0e-9400-0624b0459f57\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.183125 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82712c2b-a91c-4e0e-9400-0624b0459f57-operator-scripts\") pod \"82712c2b-a91c-4e0e-9400-0624b0459f57\" (UID: \"82712c2b-a91c-4e0e-9400-0624b0459f57\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.183997 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf9f5\" (UniqueName: 
\"kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.184164 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.184390 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfn69\" (UniqueName: \"kubernetes.io/projected/9b66eae6-4565-4f56-8bdc-009aa1101a64-kube-api-access-pfn69\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.184403 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.184417 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwsp5\" (UniqueName: \"kubernetes.io/projected/969db92a-069e-4713-bca7-5e4ebb89d612-kube-api-access-pwsp5\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.184460 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.184473 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-478lp\" (UniqueName: 
\"kubernetes.io/projected/4352c518-ada9-4e5e-9327-5bd3c34a2796-kube-api-access-478lp\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.184483 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.187311 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82712c2b-a91c-4e0e-9400-0624b0459f57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82712c2b-a91c-4e0e-9400-0624b0459f57" (UID: "82712c2b-a91c-4e0e-9400-0624b0459f57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.195726 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-config-data" (OuterVolumeSpecName: "config-data") pod "911535c0-45eb-4361-b169-fad54a54d78b" (UID: "911535c0-45eb-4361-b169-fad54a54d78b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.204258 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8tflc"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.215104 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-config-data" (OuterVolumeSpecName: "config-data") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.215169 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82712c2b-a91c-4e0e-9400-0624b0459f57-kube-api-access-p6gjd" (OuterVolumeSpecName: "kube-api-access-p6gjd") pod "82712c2b-a91c-4e0e-9400-0624b0459f57" (UID: "82712c2b-a91c-4e0e-9400-0624b0459f57"). InnerVolumeSpecName "kube-api-access-p6gjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.216625 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.236349 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8tflc"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.249434 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b51d-account-create-update-6tf4f"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.271240 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b51d-account-create-update-6tf4f"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.289267 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.289531 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.289650 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts 
podName:c9e6c86c-c8da-4c0b-8aec-758d3b2f5731 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:21.789618714 +0000 UTC m=+1604.040945848 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts") pod "keystone-8b47-account-create-update-5zh78" (UID: "c9e6c86c-c8da-4c0b-8aec-758d3b2f5731") : configmap "openstack-scripts" not found Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.290874 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "911535c0-45eb-4361-b169-fad54a54d78b" (UID: "911535c0-45eb-4361-b169-fad54a54d78b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.293723 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9f5\" (UniqueName: \"kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.294055 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6gjd\" (UniqueName: \"kubernetes.io/projected/82712c2b-a91c-4e0e-9400-0624b0459f57-kube-api-access-p6gjd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.294082 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82712c2b-a91c-4e0e-9400-0624b0459f57-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.294092 5047 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.294103 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.294112 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911535c0-45eb-4361-b169-fad54a54d78b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.294204 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.294272 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data podName:1e83857b-5e17-4878-8f9b-e8d1a65325ba nodeName:}" failed. No retries permitted until 2026-02-23 07:11:29.294246275 +0000 UTC m=+1611.545573599 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.315795 5047 projected.go:194] Error preparing data for projected volume kube-api-access-xf9f5 for pod openstack/keystone-8b47-account-create-update-5zh78: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.315870 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5 podName:c9e6c86c-c8da-4c0b-8aec-758d3b2f5731 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:21.815850605 +0000 UTC m=+1604.067177729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xf9f5" (UniqueName: "kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5") pod "keystone-8b47-account-create-update-5zh78" (UID: "c9e6c86c-c8da-4c0b-8aec-758d3b2f5731") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.329578 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5864496f8c-9bg5d"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.329831 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5864496f8c-9bg5d" podUID="f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" containerName="keystone-api" containerID="cri-o://fc5d3551724091dfc09fc2b99a9eac93b56874e2010d67bb0d6736fc0d101cdb" gracePeriod=30 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.354428 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.381604 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a452777f-60a7-4cfc-9e9b-b262be0a27cf" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.386876 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data" (OuterVolumeSpecName: "config-data") pod "a452777f-60a7-4cfc-9e9b-b262be0a27cf" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.389890 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8b47-account-create-update-5zh78"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.398124 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.398463 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.398516 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.398880 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.398896 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.405106 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-cglrk"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.405203 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: 
"f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.407137 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.409912 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-cglrk"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.419372 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f29958fb-3a36-427e-8094-62f7522b7a17" (UID: "f29958fb-3a36-427e-8094-62f7522b7a17"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.427898 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bz2gr"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.437557 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.452430 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c0b-account-create-update-fvht5"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.453052 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.454956 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.459840 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8c0b-account-create-update-fvht5"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.462849 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.463206 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9b66eae6-4565-4f56-8bdc-009aa1101a64" (UID: "9b66eae6-4565-4f56-8bdc-009aa1101a64"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.469123 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4352c518-ada9-4e5e-9327-5bd3c34a2796" (UID: "4352c518-ada9-4e5e-9327-5bd3c34a2796"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.496312 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xf9f5 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-8b47-account-create-update-5zh78" podUID="c9e6c86c-c8da-4c0b-8aec-758d3b2f5731" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.502453 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-combined-ca-bundle\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.502870 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-etc-machine-id\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.502981 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-combined-ca-bundle\") pod \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.503100 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-httpd-run\") pod \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.503265 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-config-data\") pod \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.503564 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.503708 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-combined-ca-bundle\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.503786 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8txjc\" (UniqueName: \"kubernetes.io/projected/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-kube-api-access-8txjc\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.503934 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-httpd-run\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504010 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-combined-ca-bundle\") pod \"5c628150-3bde-4740-9c94-dc208f61ade2\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504088 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-config-data\") pod \"5c628150-3bde-4740-9c94-dc208f61ade2\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504152 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npt6r\" (UniqueName: \"kubernetes.io/projected/5c628150-3bde-4740-9c94-dc208f61ade2-kube-api-access-npt6r\") pod \"5c628150-3bde-4740-9c94-dc208f61ade2\" (UID: \"5c628150-3bde-4740-9c94-dc208f61ade2\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504235 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-logs\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504333 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504399 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-public-tls-certs\") pod \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504468 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504533 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-config-data\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504610 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-internal-tls-certs\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.504869 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.505160 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data-custom\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506093 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-scripts\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506183 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-scripts\") pod 
\"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506268 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njggm\" (UniqueName: \"kubernetes.io/projected/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-kube-api-access-njggm\") pod \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506331 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506434 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-scripts\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-internal-tls-certs\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506617 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-public-tls-certs\") pod \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\" (UID: \"eb737ca7-c18f-4ff9-9285-bb35ee17cd05\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506730 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nmbl\" 
(UniqueName: \"kubernetes.io/projected/69d03df4-c334-4c64-a273-e4e307df5add-kube-api-access-7nmbl\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506807 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-logs\") pod \"69d03df4-c334-4c64-a273-e4e307df5add\" (UID: \"69d03df4-c334-4c64-a273-e4e307df5add\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.506884 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-logs\") pod \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\" (UID: \"7d9eb562-3f84-458f-885a-e2fbb3e86bf3\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.507697 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.507769 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.507852 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.507926 5047 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b66eae6-4565-4f56-8bdc-009aa1101a64-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 
07:11:21.507982 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.508044 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29958fb-3a36-427e-8094-62f7522b7a17-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.512896 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.513178 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.515207 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-logs" (OuterVolumeSpecName: "logs") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.516278 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.516723 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-logs" (OuterVolumeSpecName: "logs") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.516937 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-logs" (OuterVolumeSpecName: "logs") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.521263 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-scripts" (OuterVolumeSpecName: "scripts") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.523009 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.523074 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-scripts" (OuterVolumeSpecName: "scripts") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.523219 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-kube-api-access-8txjc" (OuterVolumeSpecName: "kube-api-access-8txjc") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "kube-api-access-8txjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.525814 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.529184 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a452777f-60a7-4cfc-9e9b-b262be0a27cf" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.530764 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.531201 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-kube-api-access-njggm" (OuterVolumeSpecName: "kube-api-access-njggm") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "kube-api-access-njggm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.532913 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d03df4-c334-4c64-a273-e4e307df5add-kube-api-access-7nmbl" (OuterVolumeSpecName: "kube-api-access-7nmbl") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "kube-api-access-7nmbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.533030 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c628150-3bde-4740-9c94-dc208f61ade2-kube-api-access-npt6r" (OuterVolumeSpecName: "kube-api-access-npt6r") pod "5c628150-3bde-4740-9c94-dc208f61ade2" (UID: "5c628150-3bde-4740-9c94-dc208f61ade2"). InnerVolumeSpecName "kube-api-access-npt6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.544845 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-scripts" (OuterVolumeSpecName: "scripts") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.553797 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.556221 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.570120 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.576618 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data" (OuterVolumeSpecName: "config-data") pod "4352c518-ada9-4e5e-9327-5bd3c34a2796" (UID: "4352c518-ada9-4e5e-9327-5bd3c34a2796"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.610988 5047 generic.go:334] "Generic (PLEG): container finished" podID="e307d3a1-af99-460b-bdd0-24e26de38751" containerID="1843426c8ae30cc809f15219c4a71a415a94a52a2bc0a9a35f7e3c5e3ca3cd62" exitCode=0 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.611144 5047 generic.go:334] "Generic (PLEG): container finished" podID="e307d3a1-af99-460b-bdd0-24e26de38751" containerID="de1bae2fc8abe4518e053551089b717523cf45b88661909a16503dc7aee58baf" exitCode=2 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.611163 5047 generic.go:334] "Generic (PLEG): container finished" podID="e307d3a1-af99-460b-bdd0-24e26de38751" containerID="ac42b72f2ef3e70086002c27d0507227c425dcef23dd72ee4b41f90f506d1f7e" exitCode=0 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.613193 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerDied","Data":"1843426c8ae30cc809f15219c4a71a415a94a52a2bc0a9a35f7e3c5e3ca3cd62"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.613527 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerDied","Data":"de1bae2fc8abe4518e053551089b717523cf45b88661909a16503dc7aee58baf"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.615331 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerDied","Data":"ac42b72f2ef3e70086002c27d0507227c425dcef23dd72ee4b41f90f506d1f7e"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617206 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lcj9\" (UniqueName: \"kubernetes.io/projected/db4c480a-88f1-42e1-bdce-21bdb85ecc48-kube-api-access-8lcj9\") pod 
\"db4c480a-88f1-42e1-bdce-21bdb85ecc48\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617265 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-combined-ca-bundle\") pod \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617392 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p25s\" (UniqueName: \"kubernetes.io/projected/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-kube-api-access-2p25s\") pod \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617446 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmrbb\" (UniqueName: \"kubernetes.io/projected/27668b66-4868-448a-b2dd-e270ed4bc677-kube-api-access-bmrbb\") pod \"27668b66-4868-448a-b2dd-e270ed4bc677\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617530 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-logs\") pod \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617564 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27668b66-4868-448a-b2dd-e270ed4bc677-logs\") pod \"27668b66-4868-448a-b2dd-e270ed4bc677\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617587 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-combined-ca-bundle\") pod \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617633 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-public-tls-certs\") pod \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617706 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-internal-tls-certs\") pod \"27668b66-4868-448a-b2dd-e270ed4bc677\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617769 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4c480a-88f1-42e1-bdce-21bdb85ecc48-logs\") pod \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617798 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data\") pod \"27668b66-4868-448a-b2dd-e270ed4bc677\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.617948 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-internal-tls-certs\") pod \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " Feb 23 07:11:21 crc 
kubenswrapper[5047]: I0223 07:11:21.617984 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-public-tls-certs\") pod \"27668b66-4868-448a-b2dd-e270ed4bc677\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.618029 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data-custom\") pod \"27668b66-4868-448a-b2dd-e270ed4bc677\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.618071 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-combined-ca-bundle\") pod \"27668b66-4868-448a-b2dd-e270ed4bc677\" (UID: \"27668b66-4868-448a-b2dd-e270ed4bc677\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.618109 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-nova-metadata-tls-certs\") pod \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.618273 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-config-data\") pod \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\" (UID: \"fa3349ce-92c0-4fc5-ae0e-6424be7ca179\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.618339 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-config-data\") pod \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\" (UID: \"db4c480a-88f1-42e1-bdce-21bdb85ecc48\") " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619326 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npt6r\" (UniqueName: \"kubernetes.io/projected/5c628150-3bde-4740-9c94-dc208f61ade2-kube-api-access-npt6r\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619345 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619371 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619385 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619397 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4352c518-ada9-4e5e-9327-5bd3c34a2796-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619408 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619419 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-scripts\") on node \"crc\" 
DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619430 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619441 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njggm\" (UniqueName: \"kubernetes.io/projected/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-kube-api-access-njggm\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619459 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619469 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619479 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nmbl\" (UniqueName: \"kubernetes.io/projected/69d03df4-c334-4c64-a273-e4e307df5add-kube-api-access-7nmbl\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619489 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619498 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619547 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619561 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8txjc\" (UniqueName: \"kubernetes.io/projected/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-kube-api-access-8txjc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.619571 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69d03df4-c334-4c64-a273-e4e307df5add-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.622598 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-logs" (OuterVolumeSpecName: "logs") pod "fa3349ce-92c0-4fc5-ae0e-6424be7ca179" (UID: "fa3349ce-92c0-4fc5-ae0e-6424be7ca179"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.625797 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27668b66-4868-448a-b2dd-e270ed4bc677-logs" (OuterVolumeSpecName: "logs") pod "27668b66-4868-448a-b2dd-e270ed4bc677" (UID: "27668b66-4868-448a-b2dd-e270ed4bc677"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.628187 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb737ca7-c18f-4ff9-9285-bb35ee17cd05","Type":"ContainerDied","Data":"e871d47dd294c1855adad5a4275d3faa21d845644bd94d8d7dea5de4a9f81d8f"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.628244 5047 scope.go:117] "RemoveContainer" containerID="28fb45634504bebe907f3061a04e53baf4e24bef8b044220704d400b3404f735" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.628362 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.628617 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db4c480a-88f1-42e1-bdce-21bdb85ecc48-logs" (OuterVolumeSpecName: "logs") pod "db4c480a-88f1-42e1-bdce-21bdb85ecc48" (UID: "db4c480a-88f1-42e1-bdce-21bdb85ecc48"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.644384 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e7d5-account-create-update-89ndz"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.647607 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerName="galera" containerID="cri-o://585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe" gracePeriod=30 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.660125 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e7d5-account-create-update-89ndz"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.681296 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69d03df4-c334-4c64-a273-e4e307df5add","Type":"ContainerDied","Data":"e519685364edf1623978c606be8efb8fc2ac904406f478ece6affa0ab3a2f58b"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.681405 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.684472 5047 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-bz2gr" secret="" err="secret \"galera-openstack-dockercfg-27gbm\" not found" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.684550 5047 scope.go:117] "RemoveContainer" containerID="0d84f1852bd6b7dbc4d9bdf4ea958827422207ab1118be596167434c3cd34775" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.684894 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-bz2gr_openstack(4cdbcdf8-785e-4383-ab02-4492360bf4b4)\"" pod="openstack/root-account-create-update-bz2gr" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.700152 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.701580 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.704252 5047 generic.go:334] "Generic (PLEG): container finished" podID="27668b66-4868-448a-b2dd-e270ed4bc677" containerID="71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd" exitCode=0 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.704374 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cb54ddf68-t6k79" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.704391 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cb54ddf68-t6k79" event={"ID":"27668b66-4868-448a-b2dd-e270ed4bc677","Type":"ContainerDied","Data":"71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.704744 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cb54ddf68-t6k79" event={"ID":"27668b66-4868-448a-b2dd-e270ed4bc677","Type":"ContainerDied","Data":"b4cbc66300dbd11959b05e9470d5418d9d599f15cb883c82d6ae0561b9afe677"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.724889 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-965e-account-create-update-r6tbz"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.725287 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.725314 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27668b66-4868-448a-b2dd-e270ed4bc677-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.725325 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.725334 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.725344 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db4c480a-88f1-42e1-bdce-21bdb85ecc48-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.725420 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.725490 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts podName:4cdbcdf8-785e-4383-ab02-4492360bf4b4 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:22.22546884 +0000 UTC m=+1604.476795974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts") pod "root-account-create-update-bz2gr" (UID: "4cdbcdf8-785e-4383-ab02-4492360bf4b4") : configmap "openstack-scripts" not found Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.725942 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a452777f-60a7-4cfc-9e9b-b262be0a27cf" (UID: "a452777f-60a7-4cfc-9e9b-b262be0a27cf"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.726004 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "27668b66-4868-448a-b2dd-e270ed4bc677" (UID: "27668b66-4868-448a-b2dd-e270ed4bc677"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.726066 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4c480a-88f1-42e1-bdce-21bdb85ecc48-kube-api-access-8lcj9" (OuterVolumeSpecName: "kube-api-access-8lcj9") pod "db4c480a-88f1-42e1-bdce-21bdb85ecc48" (UID: "db4c480a-88f1-42e1-bdce-21bdb85ecc48"). InnerVolumeSpecName "kube-api-access-8lcj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.726220 5047 scope.go:117] "RemoveContainer" containerID="8fc7018d470fbeb3c101b59cebfc490bd0f1cc9cb98ca2bbc51f91d35a503470" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.729059 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-kube-api-access-2p25s" (OuterVolumeSpecName: "kube-api-access-2p25s") pod "fa3349ce-92c0-4fc5-ae0e-6424be7ca179" (UID: "fa3349ce-92c0-4fc5-ae0e-6424be7ca179"). InnerVolumeSpecName "kube-api-access-2p25s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.729126 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27668b66-4868-448a-b2dd-e270ed4bc677-kube-api-access-bmrbb" (OuterVolumeSpecName: "kube-api-access-bmrbb") pod "27668b66-4868-448a-b2dd-e270ed4bc677" (UID: "27668b66-4868-448a-b2dd-e270ed4bc677"). 
InnerVolumeSpecName "kube-api-access-bmrbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.735754 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db4c480a-88f1-42e1-bdce-21bdb85ecc48","Type":"ContainerDied","Data":"2a15af10b6fe7afc847351c4b45a65bee6b1b97516f6012fd8ae34523c830c7c"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.735889 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.741703 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d9eb562-3f84-458f-885a-e2fbb3e86bf3","Type":"ContainerDied","Data":"274a07768bdf167f837f50d8e0595e4b99f3367f27a526c0ec7cbb5e02613e3a"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.741812 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.746684 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-965e-account-create-update-r6tbz"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.764583 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa3349ce-92c0-4fc5-ae0e-6424be7ca179","Type":"ContainerDied","Data":"fc330f41b07a0a1f9d7cf14f2b8c1a2fdadd4ed855b8d39328b1ef99c577ad53"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.764657 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.778258 5047 generic.go:334] "Generic (PLEG): container finished" podID="95d08092-3eac-4289-b594-a77d5dfecfe9" containerID="897b14a78dfbcdad54f56571d5c5ba61205fe2678162c41bb69a05cb9c0da5b7" exitCode=2 Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.778411 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.778858 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"95d08092-3eac-4289-b594-a77d5dfecfe9","Type":"ContainerDied","Data":"897b14a78dfbcdad54f56571d5c5ba61205fe2678162c41bb69a05cb9c0da5b7"} Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.781191 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.782059 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.782059 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.782102 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.782112 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d867c5cb7-qbvv7" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.782148 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f4db5cb66-tjpmr" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.782170 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac64-account-create-update-zn4xz" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.782694 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5b36-account-create-update-sr2bf" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.789769 5047 scope.go:117] "RemoveContainer" containerID="92383f49d49f2ba213699c653ceecca29ba9aba4cc99635c1909b4aa5789dcd9" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.791024 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c628150-3bde-4740-9c94-dc208f61ade2" (UID: "5c628150-3bde-4740-9c94-dc208f61ade2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.793699 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-hxmz5"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.815262 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830074 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830266 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9f5\" (UniqueName: \"kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830475 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a452777f-60a7-4cfc-9e9b-b262be0a27cf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830495 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830514 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830526 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lcj9\" (UniqueName: \"kubernetes.io/projected/db4c480a-88f1-42e1-bdce-21bdb85ecc48-kube-api-access-8lcj9\") on 
node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830537 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p25s\" (UniqueName: \"kubernetes.io/projected/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-kube-api-access-2p25s\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.830551 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmrbb\" (UniqueName: \"kubernetes.io/projected/27668b66-4868-448a-b2dd-e270ed4bc677-kube-api-access-bmrbb\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.830601 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.830678 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts podName:c9e6c86c-c8da-4c0b-8aec-758d3b2f5731 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:22.830658193 +0000 UTC m=+1605.081985327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts") pod "keystone-8b47-account-create-update-5zh78" (UID: "c9e6c86c-c8da-4c0b-8aec-758d3b2f5731") : configmap "openstack-scripts" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.835757 5047 projected.go:194] Error preparing data for projected volume kube-api-access-xf9f5 for pod openstack/keystone-8b47-account-create-update-5zh78: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:11:21 crc kubenswrapper[5047]: E0223 07:11:21.835821 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5 podName:c9e6c86c-c8da-4c0b-8aec-758d3b2f5731 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:11:22.835807629 +0000 UTC m=+1605.087134763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xf9f5" (UniqueName: "kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5") pod "keystone-8b47-account-create-update-5zh78" (UID: "c9e6c86c-c8da-4c0b-8aec-758d3b2f5731") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.844095 5047 scope.go:117] "RemoveContainer" containerID="d3073ef9efc88121d60f580513fe1f931a6371611ccd81a1a8970553da615a84" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.855447 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-08b6-account-create-update-hxmz5"] Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.922272 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.936186 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnc4s\" (UniqueName: \"kubernetes.io/projected/9819ef7e-f619-45a5-b5d3-e67bb5214f24-kube-api-access-dnc4s\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.936297 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:21 crc kubenswrapper[5047]: I0223 07:11:21.936314 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9819ef7e-f619-45a5-b5d3-e67bb5214f24-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.121362 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.140143 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.148102 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-config-data" (OuterVolumeSpecName: "config-data") pod "5c628150-3bde-4740-9c94-dc208f61ade2" (UID: "5c628150-3bde-4740-9c94-dc208f61ade2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.234585 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-config-data" (OuterVolumeSpecName: "config-data") pod "db4c480a-88f1-42e1-bdce-21bdb85ecc48" (UID: "db4c480a-88f1-42e1-bdce-21bdb85ecc48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.235187 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.243113 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c628150-3bde-4740-9c94-dc208f61ade2-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.243144 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.243156 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.243284 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.243374 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts podName:4cdbcdf8-785e-4383-ab02-4492360bf4b4 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:11:23.243350379 +0000 UTC m=+1605.494677513 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts") pod "root-account-create-update-bz2gr" (UID: "4cdbcdf8-785e-4383-ab02-4492360bf4b4") : configmap "openstack-scripts" not found Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.285042 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "db4c480a-88f1-42e1-bdce-21bdb85ecc48" (UID: "db4c480a-88f1-42e1-bdce-21bdb85ecc48"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.314524 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db4c480a-88f1-42e1-bdce-21bdb85ecc48" (UID: "db4c480a-88f1-42e1-bdce-21bdb85ecc48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.325148 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa3349ce-92c0-4fc5-ae0e-6424be7ca179" (UID: "fa3349ce-92c0-4fc5-ae0e-6424be7ca179"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.326321 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27668b66-4868-448a-b2dd-e270ed4bc677" (UID: "27668b66-4868-448a-b2dd-e270ed4bc677"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.345285 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.345337 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.345352 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.345365 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.348296 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "27668b66-4868-448a-b2dd-e270ed4bc677" (UID: "27668b66-4868-448a-b2dd-e270ed4bc677"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.354489 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.355048 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199f6db1-c082-418e-9a55-330a39b15ed7" path="/var/lib/kubelet/pods/199f6db1-c082-418e-9a55-330a39b15ed7/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.355597 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a23e58e-d907-4f3e-92cb-be6652ceb9d6" path="/var/lib/kubelet/pods/2a23e58e-d907-4f3e-92cb-be6652ceb9d6/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.357543 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6665cd0b-0b9d-4485-8cdb-7abe364508d1" path="/var/lib/kubelet/pods/6665cd0b-0b9d-4485-8cdb-7abe364508d1/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.358339 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6d00f2-ae85-4f68-9645-94e666f0e1c2" path="/var/lib/kubelet/pods/6c6d00f2-ae85-4f68-9645-94e666f0e1c2/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.358687 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef3c81d-1f2a-447d-b8c3-687002fa8c35" path="/var/lib/kubelet/pods/8ef3c81d-1f2a-447d-b8c3-687002fa8c35/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.358985 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9819ef7e-f619-45a5-b5d3-e67bb5214f24" 
path="/var/lib/kubelet/pods/9819ef7e-f619-45a5-b5d3-e67bb5214f24/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.359287 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7504baf-d384-4fb8-a053-0da0c933f1e4" path="/var/lib/kubelet/pods/b7504baf-d384-4fb8-a053-0da0c933f1e4/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.359649 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78afe08-d5e4-45d8-9149-4c0a1020a4cf" path="/var/lib/kubelet/pods/b78afe08-d5e4-45d8-9149-4c0a1020a4cf/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.362966 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d513cfd3-cb98-440f-b564-d36d8f20f5a4" path="/var/lib/kubelet/pods/d513cfd3-cb98-440f-b564-d36d8f20f5a4/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.363553 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1fbf93-445e-478b-9f00-347d83b83977" path="/var/lib/kubelet/pods/ed1fbf93-445e-478b-9f00-347d83b83977/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.364066 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd8c46d-274a-4544-8d0e-16242f0d9673" path="/var/lib/kubelet/pods/fbd8c46d-274a-4544-8d0e-16242f0d9673/volumes" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.368990 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-config-data" (OuterVolumeSpecName: "config-data") pod "69d03df4-c334-4c64-a273-e4e307df5add" (UID: "69d03df4-c334-4c64-a273-e4e307df5add"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.374086 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.401002 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data" (OuterVolumeSpecName: "config-data") pod "27668b66-4868-448a-b2dd-e270ed4bc677" (UID: "27668b66-4868-448a-b2dd-e270ed4bc677"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.410953 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fa3349ce-92c0-4fc5-ae0e-6424be7ca179" (UID: "fa3349ce-92c0-4fc5-ae0e-6424be7ca179"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.432393 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "27668b66-4868-448a-b2dd-e270ed4bc677" (UID: "27668b66-4868-448a-b2dd-e270ed4bc677"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.434783 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-config-data" (OuterVolumeSpecName: "config-data") pod "fa3349ce-92c0-4fc5-ae0e-6424be7ca179" (UID: "fa3349ce-92c0-4fc5-ae0e-6424be7ca179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.445305 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448374 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448415 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448425 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448435 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 
crc kubenswrapper[5047]: I0223 07:11:22.448444 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27668b66-4868-448a-b2dd-e270ed4bc677-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448455 5047 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448466 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3349ce-92c0-4fc5-ae0e-6424be7ca179-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448475 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d03df4-c334-4c64-a273-e4e307df5add-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.448483 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.454440 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "db4c480a-88f1-42e1-bdce-21bdb85ecc48" (UID: "db4c480a-88f1-42e1-bdce-21bdb85ecc48"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.460625 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.476848 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data" (OuterVolumeSpecName: "config-data") pod "eb737ca7-c18f-4ff9-9285-bb35ee17cd05" (UID: "eb737ca7-c18f-4ff9-9285-bb35ee17cd05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.485507 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-config-data" (OuterVolumeSpecName: "config-data") pod "7d9eb562-3f84-458f-885a-e2fbb3e86bf3" (UID: "7d9eb562-3f84-458f-885a-e2fbb3e86bf3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.550316 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.550360 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4c480a-88f1-42e1-bdce-21bdb85ecc48-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.550374 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb737ca7-c18f-4ff9-9285-bb35ee17cd05-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.550385 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9eb562-3f84-458f-885a-e2fbb3e86bf3-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.652264 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.652775 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data podName:6776acf2-e53f-4892-847d-8667669a5eb9 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:30.652733278 +0000 UTC m=+1612.904060582 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data") pod "rabbitmq-server-0" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9") : configmap "rabbitmq-config-data" not found Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.668035 5047 scope.go:117] "RemoveContainer" containerID="71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.737179 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.765520 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.791289 5047 scope.go:117] "RemoveContainer" containerID="41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.818128 5047 generic.go:334] "Generic (PLEG): container finished" podID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerID="dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63" exitCode=0 Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.818309 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dqll" event={"ID":"8fc70af3-2572-4019-adcb-8aecf538ae27","Type":"ContainerDied","Data":"dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63"} Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.825767 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"95d08092-3eac-4289-b594-a77d5dfecfe9","Type":"ContainerDied","Data":"74fe8d1d5f8f73a388f5636574972cc01a2efd9b5d40891d42ee819b7fe495ba"} Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.825815 5047 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="74fe8d1d5f8f73a388f5636574972cc01a2efd9b5d40891d42ee819b7fe495ba" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.855093 5047 generic.go:334] "Generic (PLEG): container finished" podID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerID="2ac5bc2b66d5e8526a58472b2784d5b39320ba72eae19f8bc017391874b9616a" exitCode=0 Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.855245 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e83857b-5e17-4878-8f9b-e8d1a65325ba","Type":"ContainerDied","Data":"2ac5bc2b66d5e8526a58472b2784d5b39320ba72eae19f8bc017391874b9616a"} Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.857103 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9f5\" (UniqueName: \"kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.857249 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts\") pod \"keystone-8b47-account-create-update-5zh78\" (UID: \"c9e6c86c-c8da-4c0b-8aec-758d3b2f5731\") " pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.857366 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.857426 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts podName:c9e6c86c-c8da-4c0b-8aec-758d3b2f5731 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:11:24.857409503 +0000 UTC m=+1607.108736637 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts") pod "keystone-8b47-account-create-update-5zh78" (UID: "c9e6c86c-c8da-4c0b-8aec-758d3b2f5731") : configmap "openstack-scripts" not found Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.864047 5047 projected.go:194] Error preparing data for projected volume kube-api-access-xf9f5 for pod openstack/keystone-8b47-account-create-update-5zh78: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:11:22 crc kubenswrapper[5047]: E0223 07:11:22.864116 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5 podName:c9e6c86c-c8da-4c0b-8aec-758d3b2f5731 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:24.864098129 +0000 UTC m=+1607.115425263 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xf9f5" (UniqueName: "kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5") pod "keystone-8b47-account-create-update-5zh78" (UID: "c9e6c86c-c8da-4c0b-8aec-758d3b2f5731") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.867393 5047 generic.go:334] "Generic (PLEG): container finished" podID="4c814692-df9e-470f-8aad-364d48f82b81" containerID="bcb9e71f1342662d147a950f9bce3acb70d72672c9ecd0106c2560c4f8b80214" exitCode=0 Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.867540 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4c814692-df9e-470f-8aad-364d48f82b81","Type":"ContainerDied","Data":"bcb9e71f1342662d147a950f9bce3acb70d72672c9ecd0106c2560c4f8b80214"} Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.867586 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4c814692-df9e-470f-8aad-364d48f82b81","Type":"ContainerDied","Data":"393bac14fc8ea3deed8f5dfee5c88ca4c99ed425ce94b96d2881cc600d293cf4"} Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.867605 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="393bac14fc8ea3deed8f5dfee5c88ca4c99ed425ce94b96d2881cc600d293cf4" Feb 23 07:11:22 crc kubenswrapper[5047]: I0223 07:11:22.870744 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8b47-account-create-update-5zh78" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.202263 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.224320 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.242227 5047 scope.go:117] "RemoveContainer" containerID="71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd" Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.246126 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd\": container with ID starting with 71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd not found: ID does not exist" containerID="71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.246197 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd"} err="failed to get container status \"71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd\": rpc error: code = NotFound desc = could not find container \"71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd\": container with ID starting with 71481b06f6db4dcf1c5efa4c83fd7a72403b3075d95de76c6a2614c7e972fbcd not found: ID does not exist" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.246276 5047 scope.go:117] "RemoveContainer" containerID="41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.247583 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.249731 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854\": container with ID starting with 41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854 not found: ID does not exist" containerID="41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.249769 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854"} err="failed to get container status \"41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854\": rpc error: code = NotFound desc = could not find container \"41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854\": container with ID starting with 41b717817973f595812fe1eb20824b0b443f9f0d8cd641d6601c7057d79f6854 not found: ID does not exist" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.249794 5047 scope.go:117] "RemoveContainer" containerID="9e039263a53bbeb35ac12ba86211b210fd996b0f94ebba2771ea8c71cf94c08e" Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.267596 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.267659 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts podName:4cdbcdf8-785e-4383-ab02-4492360bf4b4 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:25.267642465 +0000 UTC m=+1607.518969599 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts") pod "root-account-create-update-bz2gr" (UID: "4cdbcdf8-785e-4383-ab02-4492360bf4b4") : configmap "openstack-scripts" not found Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.365306 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.370494 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4pqg\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-kube-api-access-q4pqg\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.372664 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-plugins\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.373516 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.373754 5047 scope.go:117] "RemoveContainer" containerID="832e9b67b68d0d91fdc72fdc39407b689d44735575143025b03550e2cf607d48" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375581 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-certs\") pod \"95d08092-3eac-4289-b594-a77d5dfecfe9\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375615 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e83857b-5e17-4878-8f9b-e8d1a65325ba-pod-info\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375638 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfk2q\" (UniqueName: \"kubernetes.io/projected/4c814692-df9e-470f-8aad-364d48f82b81-kube-api-access-xfk2q\") pod \"4c814692-df9e-470f-8aad-364d48f82b81\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375669 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-kolla-config\") pod \"4c814692-df9e-470f-8aad-364d48f82b81\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375710 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375731 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-plugins-conf\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 
07:11:23.375756 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-server-conf\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375786 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-confd\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375809 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-config\") pod \"95d08092-3eac-4289-b594-a77d5dfecfe9\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375839 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-erlang-cookie\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375865 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-combined-ca-bundle\") pod \"95d08092-3eac-4289-b594-a77d5dfecfe9\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375891 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwc9m\" (UniqueName: 
\"kubernetes.io/projected/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-api-access-cwc9m\") pod \"95d08092-3eac-4289-b594-a77d5dfecfe9\" (UID: \"95d08092-3eac-4289-b594-a77d5dfecfe9\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375942 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts\") pod \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375968 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e83857b-5e17-4878-8f9b-e8d1a65325ba-erlang-cookie-secret\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.375991 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-tls\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.376009 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-combined-ca-bundle\") pod \"4c814692-df9e-470f-8aad-364d48f82b81\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.376032 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h2kg\" (UniqueName: \"kubernetes.io/projected/4cdbcdf8-785e-4383-ab02-4492360bf4b4-kube-api-access-6h2kg\") pod \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\" (UID: \"4cdbcdf8-785e-4383-ab02-4492360bf4b4\") " Feb 23 07:11:23 
crc kubenswrapper[5047]: I0223 07:11:23.376053 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\" (UID: \"1e83857b-5e17-4878-8f9b-e8d1a65325ba\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.376071 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-memcached-tls-certs\") pod \"4c814692-df9e-470f-8aad-364d48f82b81\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.376091 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-config-data\") pod \"4c814692-df9e-470f-8aad-364d48f82b81\" (UID: \"4c814692-df9e-470f-8aad-364d48f82b81\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.376515 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.376874 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.381328 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.385844 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.386480 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4c814692-df9e-470f-8aad-364d48f82b81" (UID: "4c814692-df9e-470f-8aad-364d48f82b81"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.387405 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cdbcdf8-785e-4383-ab02-4492360bf4b4" (UID: "4cdbcdf8-785e-4383-ab02-4492360bf4b4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.393931 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-config-data" (OuterVolumeSpecName: "config-data") pod "4c814692-df9e-470f-8aad-364d48f82b81" (UID: "4c814692-df9e-470f-8aad-364d48f82b81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.397870 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.405002 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.432977 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.436655 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-kube-api-access-q4pqg" (OuterVolumeSpecName: "kube-api-access-q4pqg") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "kube-api-access-q4pqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.436813 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:11:23 crc kubenswrapper[5047]: E0223 07:11:23.436870 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" containerName="nova-cell0-conductor-conductor" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.437418 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1e83857b-5e17-4878-8f9b-e8d1a65325ba-pod-info" (OuterVolumeSpecName: "pod-info") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.437790 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c814692-df9e-470f-8aad-364d48f82b81-kube-api-access-xfk2q" (OuterVolumeSpecName: "kube-api-access-xfk2q") pod "4c814692-df9e-470f-8aad-364d48f82b81" (UID: "4c814692-df9e-470f-8aad-364d48f82b81"). InnerVolumeSpecName "kube-api-access-xfk2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.438764 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.438930 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-api-access-cwc9m" (OuterVolumeSpecName: "kube-api-access-cwc9m") pod "95d08092-3eac-4289-b594-a77d5dfecfe9" (UID: "95d08092-3eac-4289-b594-a77d5dfecfe9"). InnerVolumeSpecName "kube-api-access-cwc9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.439064 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data" (OuterVolumeSpecName: "config-data") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.445217 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.451668 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.453162 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e83857b-5e17-4878-8f9b-e8d1a65325ba-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.456735 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdbcdf8-785e-4383-ab02-4492360bf4b4-kube-api-access-6h2kg" (OuterVolumeSpecName: "kube-api-access-6h2kg") pod "4cdbcdf8-785e-4383-ab02-4492360bf4b4" (UID: "4cdbcdf8-785e-4383-ab02-4492360bf4b4"). InnerVolumeSpecName "kube-api-access-6h2kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.459728 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.475621 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-server-conf" (OuterVolumeSpecName: "server-conf") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478485 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478527 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwc9m\" (UniqueName: \"kubernetes.io/projected/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-api-access-cwc9m\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478540 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cdbcdf8-785e-4383-ab02-4492360bf4b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478549 5047 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e83857b-5e17-4878-8f9b-e8d1a65325ba-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478559 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478567 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h2kg\" (UniqueName: \"kubernetes.io/projected/4cdbcdf8-785e-4383-ab02-4492360bf4b4-kube-api-access-6h2kg\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478601 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 23 07:11:23 crc 
kubenswrapper[5047]: I0223 07:11:23.478616 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478624 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4pqg\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-kube-api-access-q4pqg\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478634 5047 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e83857b-5e17-4878-8f9b-e8d1a65325ba-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478643 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfk2q\" (UniqueName: \"kubernetes.io/projected/4c814692-df9e-470f-8aad-364d48f82b81-kube-api-access-xfk2q\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478653 5047 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c814692-df9e-470f-8aad-364d48f82b81-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478663 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478674 5047 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.478682 5047 reconciler_common.go:293] "Volume detached for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/1e83857b-5e17-4878-8f9b-e8d1a65325ba-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.479249 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.515441 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.515543 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9ebb281-4310-483e-b599-3d3c8775e341/ovn-northd/0.log" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.515616 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.522194 5047 scope.go:117] "RemoveContainer" containerID="182404da8e3655734af75c4e164a4fa1026d1387cba22058cc3f4e7a9156ac6b" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.524517 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c814692-df9e-470f-8aad-364d48f82b81" (UID: "4c814692-df9e-470f-8aad-364d48f82b81"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.555344 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-sr2bf"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.565997 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5b36-account-create-update-sr2bf"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.566958 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95d08092-3eac-4289-b594-a77d5dfecfe9" (UID: "95d08092-3eac-4289-b594-a77d5dfecfe9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.575367 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.579469 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-config\") pod \"b9ebb281-4310-483e-b599-3d3c8775e341\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.579551 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-northd-tls-certs\") pod \"b9ebb281-4310-483e-b599-3d3c8775e341\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.579571 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-combined-ca-bundle\") pod 
\"b9ebb281-4310-483e-b599-3d3c8775e341\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.579599 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-scripts\") pod \"b9ebb281-4310-483e-b599-3d3c8775e341\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.579642 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-metrics-certs-tls-certs\") pod \"b9ebb281-4310-483e-b599-3d3c8775e341\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.579666 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/b9ebb281-4310-483e-b599-3d3c8775e341-kube-api-access-gncfh\") pod \"b9ebb281-4310-483e-b599-3d3c8775e341\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.579718 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-rundir\") pod \"b9ebb281-4310-483e-b599-3d3c8775e341\" (UID: \"b9ebb281-4310-483e-b599-3d3c8775e341\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.580931 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-scripts" (OuterVolumeSpecName: "scripts") pod "b9ebb281-4310-483e-b599-3d3c8775e341" (UID: "b9ebb281-4310-483e-b599-3d3c8775e341"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.581135 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b9ebb281-4310-483e-b599-3d3c8775e341" (UID: "b9ebb281-4310-483e-b599-3d3c8775e341"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.581603 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.581624 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.581639 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.581649 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.581668 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-config" (OuterVolumeSpecName: "config") pod "b9ebb281-4310-483e-b599-3d3c8775e341" (UID: "b9ebb281-4310-483e-b599-3d3c8775e341"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.585748 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.600004 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.619547 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.619603 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.633265 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.638646 5047 scope.go:117] "RemoveContainer" containerID="9fde3b6829cb3940810ad8ed9c3d9375377b1ab751496ae29f4114c773487dad" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.639866 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ebb281-4310-483e-b599-3d3c8775e341-kube-api-access-gncfh" (OuterVolumeSpecName: "kube-api-access-gncfh") pod "b9ebb281-4310-483e-b599-3d3c8775e341" (UID: "b9ebb281-4310-483e-b599-3d3c8775e341"). InnerVolumeSpecName "kube-api-access-gncfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.641717 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f4db5cb66-tjpmr"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.649953 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5f4db5cb66-tjpmr"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.659686 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5cb54ddf68-t6k79"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.677682 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5cb54ddf68-t6k79"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.683160 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gncfh\" (UniqueName: \"kubernetes.io/projected/b9ebb281-4310-483e-b599-3d3c8775e341-kube-api-access-gncfh\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.683201 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ebb281-4310-483e-b599-3d3c8775e341-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.688713 5047 scope.go:117] "RemoveContainer" containerID="57928662f0ede1894731138f7257539c72b6683c8af427121b07e24b7dd35d1f" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.690642 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ac64-account-create-update-zn4xz"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.696961 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ac64-account-create-update-zn4xz"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.714285 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.723678 5047 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.733712 5047 scope.go:117] "RemoveContainer" containerID="0108825e07a1a995d5a32fde6ef858690c1ed2dd364cbab05781f1acab8f3fcb" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.770061 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.780826 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.791498 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.796045 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d867c5cb7-qbvv7"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.842812 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-d867c5cb7-qbvv7"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.847932 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9ebb281-4310-483e-b599-3d3c8775e341" (UID: "b9ebb281-4310-483e-b599-3d3c8775e341"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.873441 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "4c814692-df9e-470f-8aad-364d48f82b81" (UID: "4c814692-df9e-470f-8aad-364d48f82b81"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.873699 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.887787 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-combined-ca-bundle\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.887872 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-default\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.887916 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-galera-tls-certs\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.887948 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kolla-config\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.887997 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-generated\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: 
\"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.888031 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw7sm\" (UniqueName: \"kubernetes.io/projected/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kube-api-access-jw7sm\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.888083 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.888102 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-operator-scripts\") pod \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\" (UID: \"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b\") " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.888337 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.888355 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.888366 5047 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c814692-df9e-470f-8aad-364d48f82b81-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.888948 5047 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "95d08092-3eac-4289-b594-a77d5dfecfe9" (UID: "95d08092-3eac-4289-b594-a77d5dfecfe9"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.889330 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.889461 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.889945 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.890471 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.893471 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kube-api-access-jw7sm" (OuterVolumeSpecName: "kube-api-access-jw7sm") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "kube-api-access-jw7sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.907644 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1e83857b-5e17-4878-8f9b-e8d1a65325ba" (UID: "1e83857b-5e17-4878-8f9b-e8d1a65325ba"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.919421 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.919527 5047 generic.go:334] "Generic (PLEG): container finished" podID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerID="585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe" exitCode=0 Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.919653 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b","Type":"ContainerDied","Data":"585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.919662 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.919694 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b","Type":"ContainerDied","Data":"18353dc955d9ec5be7959ce8e71b36cef9f279e75b3d00947b619e9d431c8749"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.919741 5047 scope.go:117] "RemoveContainer" containerID="585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.923764 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dqll" event={"ID":"8fc70af3-2572-4019-adcb-8aecf538ae27","Type":"ContainerStarted","Data":"2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.935019 5047 generic.go:334] "Generic (PLEG): container finished" podID="6776acf2-e53f-4892-847d-8667669a5eb9" containerID="4ace61959f8493ad41766db44b20ac96b4147030ebc66648117da372593fea6c" exitCode=0 Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.935114 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"6776acf2-e53f-4892-847d-8667669a5eb9","Type":"ContainerDied","Data":"4ace61959f8493ad41766db44b20ac96b4147030ebc66648117da372593fea6c"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.939511 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e83857b-5e17-4878-8f9b-e8d1a65325ba","Type":"ContainerDied","Data":"396ccfbf501c118e907af06c08d0447705957a8468cc48604d4379479b6bfce0"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.939593 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.943670 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8b47-account-create-update-5zh78"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.943838 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9ebb281-4310-483e-b599-3d3c8775e341/ovn-northd/0.log" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.943885 5047 generic.go:334] "Generic (PLEG): container finished" podID="b9ebb281-4310-483e-b599-3d3c8775e341" containerID="5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" exitCode=139 Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.943976 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9ebb281-4310-483e-b599-3d3c8775e341","Type":"ContainerDied","Data":"5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.944009 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9ebb281-4310-483e-b599-3d3c8775e341","Type":"ContainerDied","Data":"b5afbdb946c72bb7bbbfe183450ae3c3aad2bc7a179458e53fb8a61799f6f391"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.944039 5047 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.955473 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.955484 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.955630 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bz2gr" event={"ID":"4cdbcdf8-785e-4383-ab02-4492360bf4b4","Type":"ContainerDied","Data":"a1574901785be60857ec18d629ff9b5cfdcf73694ff90509e4c6b4d31858496b"} Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.955812 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.957114 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bz2gr" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.974797 5047 scope.go:117] "RemoveContainer" containerID="49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.977615 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8b47-account-create-update-5zh78"] Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989129 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "95d08092-3eac-4289-b594-a77d5dfecfe9" (UID: "95d08092-3eac-4289-b594-a77d5dfecfe9"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989596 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989623 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf9f5\" (UniqueName: \"kubernetes.io/projected/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-kube-api-access-xf9f5\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989633 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw7sm\" (UniqueName: \"kubernetes.io/projected/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kube-api-access-jw7sm\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989643 5047 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989663 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989676 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989685 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989695 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e83857b-5e17-4878-8f9b-e8d1a65325ba-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989705 5047 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/95d08092-3eac-4289-b594-a77d5dfecfe9-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989715 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989726 5047 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-kolla-config\") on 
node \"crc\" DevicePath \"\"" Feb 23 07:11:23 crc kubenswrapper[5047]: I0223 07:11:23.989735 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.012722 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.013456 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b9ebb281-4310-483e-b599-3d3c8775e341" (UID: "b9ebb281-4310-483e-b599-3d3c8775e341"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.014695 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2dqll" podStartSLOduration=7.241859255 podStartE2EDuration="10.014662012s" podCreationTimestamp="2026-02-23 07:11:14 +0000 UTC" firstStartedPulling="2026-02-23 07:11:20.54197131 +0000 UTC m=+1602.793298434" lastFinishedPulling="2026-02-23 07:11:23.314774057 +0000 UTC m=+1605.566101191" observedRunningTime="2026-02-23 07:11:23.967784557 +0000 UTC m=+1606.219111691" watchObservedRunningTime="2026-02-23 07:11:24.014662012 +0000 UTC m=+1606.265989146" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.025112 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b9ebb281-4310-483e-b599-3d3c8775e341" (UID: "b9ebb281-4310-483e-b599-3d3c8775e341"). 
InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.037995 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" (UID: "cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.092248 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.092296 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.092313 5047 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.092326 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9ebb281-4310-483e-b599-3d3c8775e341-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.126826 5047 scope.go:117] "RemoveContainer" containerID="585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe" Feb 23 07:11:24 crc kubenswrapper[5047]: E0223 07:11:24.128254 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe\": container with ID starting with 585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe not found: ID does not exist" containerID="585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.128290 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe"} err="failed to get container status \"585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe\": rpc error: code = NotFound desc = could not find container \"585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe\": container with ID starting with 585471b95568f45e2db9a4efdec5ea803c6a7ec6ad4820bfee8358da781424fe not found: ID does not exist" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.128314 5047 scope.go:117] "RemoveContainer" containerID="49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586" Feb 23 07:11:24 crc kubenswrapper[5047]: E0223 07:11:24.128509 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586\": container with ID starting with 49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586 not found: ID does not exist" containerID="49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.128529 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586"} err="failed to get container status \"49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586\": rpc error: code = NotFound desc = could not find container \"49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586\": container with ID 
starting with 49d027d860ba5223563212cb898b35637b4e15b37b156251bf043d0b350a6586 not found: ID does not exist" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.128542 5047 scope.go:117] "RemoveContainer" containerID="2ac5bc2b66d5e8526a58472b2784d5b39320ba72eae19f8bc017391874b9616a" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.176698 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.186975 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bz2gr"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.191242 5047 scope.go:117] "RemoveContainer" containerID="57718f64876e2968f7e2d3c286ddd88b4f77442fd1898477e29b9138ee17816a" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.199878 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bz2gr"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.215189 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: E0223 07:11:24.248229 5047 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 23 07:11:24 crc kubenswrapper[5047]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-23T07:11:16Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 23 07:11:24 crc kubenswrapper[5047]: /etc/init.d/functions: line 589: 449 Alarm clock "$@" Feb 23 07:11:24 crc kubenswrapper[5047]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-fk6gc" message=< Feb 23 07:11:24 crc kubenswrapper[5047]: Exiting ovn-controller (1) [FAILED] Feb 23 07:11:24 crc kubenswrapper[5047]: Killing ovn-controller (1) [ OK ] Feb 23 07:11:24 crc kubenswrapper[5047]: Killing ovn-controller (1) with SIGKILL [ 
OK ] Feb 23 07:11:24 crc kubenswrapper[5047]: 2026-02-23T07:11:16Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 23 07:11:24 crc kubenswrapper[5047]: /etc/init.d/functions: line 589: 449 Alarm clock "$@" Feb 23 07:11:24 crc kubenswrapper[5047]: > Feb 23 07:11:24 crc kubenswrapper[5047]: E0223 07:11:24.248279 5047 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 23 07:11:24 crc kubenswrapper[5047]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-23T07:11:16Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 23 07:11:24 crc kubenswrapper[5047]: /etc/init.d/functions: line 589: 449 Alarm clock "$@" Feb 23 07:11:24 crc kubenswrapper[5047]: > pod="openstack/ovn-controller-fk6gc" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" containerID="cri-o://6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.248334 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-fk6gc" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" containerID="cri-o://6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2" gracePeriod=22 Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.248645 5047 scope.go:117] "RemoveContainer" containerID="1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.248878 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.259138 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.268287 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296307 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296403 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-plugins\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296435 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-server-conf\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296461 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjgtg\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-kube-api-access-hjgtg\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296486 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6776acf2-e53f-4892-847d-8667669a5eb9-erlang-cookie-secret\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296515 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-confd\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" 
(UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296537 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6776acf2-e53f-4892-847d-8667669a5eb9-pod-info\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296624 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296666 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-erlang-cookie\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296714 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-plugins-conf\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.296778 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-tls\") pod \"6776acf2-e53f-4892-847d-8667669a5eb9\" (UID: \"6776acf2-e53f-4892-847d-8667669a5eb9\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.310613 5047 scope.go:117] "RemoveContainer" containerID="5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" Feb 23 
07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.311353 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.313239 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.313664 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.314234 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.323784 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6776acf2-e53f-4892-847d-8667669a5eb9-pod-info" (OuterVolumeSpecName: "pod-info") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.324487 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.334025 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-kube-api-access-hjgtg" (OuterVolumeSpecName: "kube-api-access-hjgtg") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "kube-api-access-hjgtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.337428 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6776acf2-e53f-4892-847d-8667669a5eb9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.368001 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data" (OuterVolumeSpecName: "config-data") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.372696 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" path="/var/lib/kubelet/pods/1e83857b-5e17-4878-8f9b-e8d1a65325ba/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.375135 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" path="/var/lib/kubelet/pods/27668b66-4868-448a-b2dd-e270ed4bc677/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.385635 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4352c518-ada9-4e5e-9327-5bd3c34a2796" path="/var/lib/kubelet/pods/4352c518-ada9-4e5e-9327-5bd3c34a2796/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.386708 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c814692-df9e-470f-8aad-364d48f82b81" path="/var/lib/kubelet/pods/4c814692-df9e-470f-8aad-364d48f82b81/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.387427 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" path="/var/lib/kubelet/pods/4cdbcdf8-785e-4383-ab02-4492360bf4b4/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.396462 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c628150-3bde-4740-9c94-dc208f61ade2" path="/var/lib/kubelet/pods/5c628150-3bde-4740-9c94-dc208f61ade2/volumes" Feb 23 
07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.397221 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-server-conf" (OuterVolumeSpecName: "server-conf") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.397922 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d03df4-c334-4c64-a273-e4e307df5add" path="/var/lib/kubelet/pods/69d03df4-c334-4c64-a273-e4e307df5add/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399487 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" path="/var/lib/kubelet/pods/7d9eb562-3f84-458f-885a-e2fbb3e86bf3/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399513 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399557 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399569 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399578 5047 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc 
kubenswrapper[5047]: I0223 07:11:24.399588 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjgtg\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-kube-api-access-hjgtg\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399600 5047 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6776acf2-e53f-4892-847d-8667669a5eb9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399609 5047 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6776acf2-e53f-4892-847d-8667669a5eb9-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399619 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399628 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.399635 5047 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6776acf2-e53f-4892-847d-8667669a5eb9-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.400428 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82712c2b-a91c-4e0e-9400-0624b0459f57" path="/var/lib/kubelet/pods/82712c2b-a91c-4e0e-9400-0624b0459f57/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.400804 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="911535c0-45eb-4361-b169-fad54a54d78b" path="/var/lib/kubelet/pods/911535c0-45eb-4361-b169-fad54a54d78b/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.401310 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969db92a-069e-4713-bca7-5e4ebb89d612" path="/var/lib/kubelet/pods/969db92a-069e-4713-bca7-5e4ebb89d612/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.405465 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b66eae6-4565-4f56-8bdc-009aa1101a64" path="/var/lib/kubelet/pods/9b66eae6-4565-4f56-8bdc-009aa1101a64/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.406220 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a452777f-60a7-4cfc-9e9b-b262be0a27cf" path="/var/lib/kubelet/pods/a452777f-60a7-4cfc-9e9b-b262be0a27cf/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.409578 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e6c86c-c8da-4c0b-8aec-758d3b2f5731" path="/var/lib/kubelet/pods/c9e6c86c-c8da-4c0b-8aec-758d3b2f5731/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.411603 5047 scope.go:117] "RemoveContainer" containerID="1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.413304 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" path="/var/lib/kubelet/pods/db4c480a-88f1-42e1-bdce-21bdb85ecc48/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.426345 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" path="/var/lib/kubelet/pods/eb737ca7-c18f-4ff9-9285-bb35ee17cd05/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.427709 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29958fb-3a36-427e-8094-62f7522b7a17" 
path="/var/lib/kubelet/pods/f29958fb-3a36-427e-8094-62f7522b7a17/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: E0223 07:11:24.428639 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0\": container with ID starting with 1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0 not found: ID does not exist" containerID="1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.428699 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0"} err="failed to get container status \"1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0\": rpc error: code = NotFound desc = could not find container \"1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0\": container with ID starting with 1495f22cd17554121ec2c04f6d6b705647c2b88d975d0e151cc974cc3354d8b0 not found: ID does not exist" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.428735 5047 scope.go:117] "RemoveContainer" containerID="5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.430695 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 23 07:11:24 crc kubenswrapper[5047]: E0223 07:11:24.434090 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22\": container with ID starting with 5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22 not found: ID does not exist" 
containerID="5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.434142 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22"} err="failed to get container status \"5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22\": rpc error: code = NotFound desc = could not find container \"5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22\": container with ID starting with 5d5ae3af81afb5006470700716f06182bd22121762f51c43cfb8464d91357b22 not found: ID does not exist" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.434171 5047 scope.go:117] "RemoveContainer" containerID="0d84f1852bd6b7dbc4d9bdf4ea958827422207ab1118be596167434c3cd34775" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.434198 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" path="/var/lib/kubelet/pods/fa3349ce-92c0-4fc5-ae0e-6424be7ca179/volumes" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.443018 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.443061 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.443076 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.443088 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.443099 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.443110 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-northd-0"] Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.486129 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6776acf2-e53f-4892-847d-8667669a5eb9" (UID: "6776acf2-e53f-4892-847d-8667669a5eb9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.509170 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.510728 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6776acf2-e53f-4892-847d-8667669a5eb9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.536570 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-5864496f8c-9bg5d" podUID="f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.154:5000/v3\": read tcp 10.217.0.2:37664->10.217.0.154:5000: read: connection reset by peer" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.749700 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fk6gc_6f9e1257-3765-4d4e-8110-81c55d1546d4/ovn-controller/0.log" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.749821 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fk6gc" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.933720 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-combined-ca-bundle\") pod \"6f9e1257-3765-4d4e-8110-81c55d1546d4\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.934158 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79mt8\" (UniqueName: \"kubernetes.io/projected/6f9e1257-3765-4d4e-8110-81c55d1546d4-kube-api-access-79mt8\") pod \"6f9e1257-3765-4d4e-8110-81c55d1546d4\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.934198 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-log-ovn\") pod \"6f9e1257-3765-4d4e-8110-81c55d1546d4\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.934429 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run\") pod \"6f9e1257-3765-4d4e-8110-81c55d1546d4\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.934489 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-ovn-controller-tls-certs\") pod \"6f9e1257-3765-4d4e-8110-81c55d1546d4\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.934506 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run-ovn\") pod \"6f9e1257-3765-4d4e-8110-81c55d1546d4\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.934654 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run" (OuterVolumeSpecName: "var-run") pod "6f9e1257-3765-4d4e-8110-81c55d1546d4" (UID: "6f9e1257-3765-4d4e-8110-81c55d1546d4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.934713 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6f9e1257-3765-4d4e-8110-81c55d1546d4" (UID: "6f9e1257-3765-4d4e-8110-81c55d1546d4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.935171 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9e1257-3765-4d4e-8110-81c55d1546d4-scripts\") pod \"6f9e1257-3765-4d4e-8110-81c55d1546d4\" (UID: \"6f9e1257-3765-4d4e-8110-81c55d1546d4\") " Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.935621 5047 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.935634 5047 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.939977 5047 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9e1257-3765-4d4e-8110-81c55d1546d4-scripts" (OuterVolumeSpecName: "scripts") pod "6f9e1257-3765-4d4e-8110-81c55d1546d4" (UID: "6f9e1257-3765-4d4e-8110-81c55d1546d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.940037 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6f9e1257-3765-4d4e-8110-81c55d1546d4" (UID: "6f9e1257-3765-4d4e-8110-81c55d1546d4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.940962 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9e1257-3765-4d4e-8110-81c55d1546d4-kube-api-access-79mt8" (OuterVolumeSpecName: "kube-api-access-79mt8") pod "6f9e1257-3765-4d4e-8110-81c55d1546d4" (UID: "6f9e1257-3765-4d4e-8110-81c55d1546d4"). InnerVolumeSpecName "kube-api-access-79mt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.988989 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f9e1257-3765-4d4e-8110-81c55d1546d4" (UID: "6f9e1257-3765-4d4e-8110-81c55d1546d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.992208 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fk6gc_6f9e1257-3765-4d4e-8110-81c55d1546d4/ovn-controller/0.log" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.992247 5047 generic.go:334] "Generic (PLEG): container finished" podID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerID="6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2" exitCode=137 Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.992297 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fk6gc" event={"ID":"6f9e1257-3765-4d4e-8110-81c55d1546d4","Type":"ContainerDied","Data":"6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2"} Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.992330 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fk6gc" event={"ID":"6f9e1257-3765-4d4e-8110-81c55d1546d4","Type":"ContainerDied","Data":"37a65ce9726d6562bc400aab960c5380596ebe4d5aeeb9f93d6f608a1755cf92"} Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.992349 5047 scope.go:117] "RemoveContainer" containerID="6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2" Feb 23 07:11:24 crc kubenswrapper[5047]: I0223 07:11:24.992501 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fk6gc" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.002860 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6776acf2-e53f-4892-847d-8667669a5eb9","Type":"ContainerDied","Data":"4cb31b7c6c4c44a36739dd396af6d10d4f7cad518a35bfea978bc2caa4d8e43e"} Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.003187 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.012339 5047 generic.go:334] "Generic (PLEG): container finished" podID="e307d3a1-af99-460b-bdd0-24e26de38751" containerID="235052c048504a3689eeba925718b5995ae8bb50fcdd7c36cd0a37a3cac313c6" exitCode=0 Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.012419 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerDied","Data":"235052c048504a3689eeba925718b5995ae8bb50fcdd7c36cd0a37a3cac313c6"} Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.025216 5047 generic.go:334] "Generic (PLEG): container finished" podID="f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" containerID="fc5d3551724091dfc09fc2b99a9eac93b56874e2010d67bb0d6736fc0d101cdb" exitCode=0 Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.026032 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5864496f8c-9bg5d" event={"ID":"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08","Type":"ContainerDied","Data":"fc5d3551724091dfc09fc2b99a9eac93b56874e2010d67bb0d6736fc0d101cdb"} Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.045757 5047 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f9e1257-3765-4d4e-8110-81c55d1546d4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.045785 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f9e1257-3765-4d4e-8110-81c55d1546d4-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.045796 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc 
kubenswrapper[5047]: I0223 07:11:25.045805 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79mt8\" (UniqueName: \"kubernetes.io/projected/6f9e1257-3765-4d4e-8110-81c55d1546d4-kube-api-access-79mt8\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.054737 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "6f9e1257-3765-4d4e-8110-81c55d1546d4" (UID: "6f9e1257-3765-4d4e-8110-81c55d1546d4"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.150925 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f9e1257-3765-4d4e-8110-81c55d1546d4-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.204885 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.205465 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:25 crc 
kubenswrapper[5047]: E0223 07:11:25.205833 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.205860 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.206621 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.208128 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.213061 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.213144 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.311372 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.313328 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.314606 5047 scope.go:117] "RemoveContainer" containerID="6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2" Feb 23 07:11:25 crc kubenswrapper[5047]: E0223 07:11:25.322183 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2\": container with ID starting with 6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2 not found: ID does not exist" containerID="6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.322241 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2"} err="failed to get container status \"6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2\": rpc error: code = NotFound desc = could not find container \"6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2\": container with ID starting with 
6f3319bfcc7d65090dc9df189b620cc8ae6191aa2dd58d96ba872a90f1c567e2 not found: ID does not exist" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.322276 5047 scope.go:117] "RemoveContainer" containerID="4ace61959f8493ad41766db44b20ac96b4147030ebc66648117da372593fea6c" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.324791 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.341264 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.354377 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-run-httpd\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.354604 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-credential-keys\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.355828 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.358941 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-internal-tls-certs\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.359256 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-scripts\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.359319 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-ceilometer-tls-certs\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.359375 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9xwq\" (UniqueName: \"kubernetes.io/projected/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-kube-api-access-r9xwq\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.359404 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-config-data\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.359437 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-public-tls-certs\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.359490 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-fernet-keys\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.360153 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.380145 5047 scope.go:117] "RemoveContainer" containerID="1550a785ef91ff8b4c17d9b0e6b0acd706caa1828e577823dabee876eb27c6e4" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.380143 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-kube-api-access-r9xwq" (OuterVolumeSpecName: "kube-api-access-r9xwq") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "kube-api-access-r9xwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.380231 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-scripts" (OuterVolumeSpecName: "scripts") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.380834 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.384013 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.399004 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fk6gc"] Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.408921 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fk6gc"] Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.414050 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.434225 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.458130 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.462127 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2d5z\" (UniqueName: \"kubernetes.io/projected/e307d3a1-af99-460b-bdd0-24e26de38751-kube-api-access-z2d5z\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.462198 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-scripts\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.462228 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-combined-ca-bundle\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: 
\"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.462264 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-combined-ca-bundle\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.462311 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-config-data\") pod \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\" (UID: \"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.462334 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-sg-core-conf-yaml\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.463077 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-log-httpd\") pod \"e307d3a1-af99-460b-bdd0-24e26de38751\" (UID: \"e307d3a1-af99-460b-bdd0-24e26de38751\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.463546 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.463565 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc 
kubenswrapper[5047]: I0223 07:11:25.463578 5047 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.463590 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9xwq\" (UniqueName: \"kubernetes.io/projected/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-kube-api-access-r9xwq\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.463604 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.463634 5047 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.463644 5047 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.464054 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.480828 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e307d3a1-af99-460b-bdd0-24e26de38751-kube-api-access-z2d5z" (OuterVolumeSpecName: "kube-api-access-z2d5z") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "kube-api-access-z2d5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.486650 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-scripts" (OuterVolumeSpecName: "scripts") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.500658 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.507686 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-config-data" (OuterVolumeSpecName: "config-data") pod "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" (UID: "f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.507616 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-config-data" (OuterVolumeSpecName: "config-data") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.530700 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.566515 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.566555 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.566566 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.566575 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e307d3a1-af99-460b-bdd0-24e26de38751-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 
07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.566585 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.566594 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2d5z\" (UniqueName: \"kubernetes.io/projected/e307d3a1-af99-460b-bdd0-24e26de38751-kube-api-access-z2d5z\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.566602 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.583541 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.583683 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e307d3a1-af99-460b-bdd0-24e26de38751" (UID: "e307d3a1-af99-460b-bdd0-24e26de38751"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.667937 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data-custom\") pod \"8af89633-d2bc-4f80-9e1e-0eb183f11462\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.668049 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af89633-d2bc-4f80-9e1e-0eb183f11462-logs\") pod \"8af89633-d2bc-4f80-9e1e-0eb183f11462\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.668114 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-combined-ca-bundle\") pod \"8af89633-d2bc-4f80-9e1e-0eb183f11462\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.668168 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data\") pod \"8af89633-d2bc-4f80-9e1e-0eb183f11462\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.668208 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r4sq\" (UniqueName: \"kubernetes.io/projected/8af89633-d2bc-4f80-9e1e-0eb183f11462-kube-api-access-6r4sq\") pod \"8af89633-d2bc-4f80-9e1e-0eb183f11462\" (UID: \"8af89633-d2bc-4f80-9e1e-0eb183f11462\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.668540 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e307d3a1-af99-460b-bdd0-24e26de38751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.669006 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af89633-d2bc-4f80-9e1e-0eb183f11462-logs" (OuterVolumeSpecName: "logs") pod "8af89633-d2bc-4f80-9e1e-0eb183f11462" (UID: "8af89633-d2bc-4f80-9e1e-0eb183f11462"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.672195 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.693307 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8af89633-d2bc-4f80-9e1e-0eb183f11462" (UID: "8af89633-d2bc-4f80-9e1e-0eb183f11462"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.712159 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af89633-d2bc-4f80-9e1e-0eb183f11462-kube-api-access-6r4sq" (OuterVolumeSpecName: "kube-api-access-6r4sq") pod "8af89633-d2bc-4f80-9e1e-0eb183f11462" (UID: "8af89633-d2bc-4f80-9e1e-0eb183f11462"). InnerVolumeSpecName "kube-api-access-6r4sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.729181 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8af89633-d2bc-4f80-9e1e-0eb183f11462" (UID: "8af89633-d2bc-4f80-9e1e-0eb183f11462"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.749562 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.750087 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data" (OuterVolumeSpecName: "config-data") pod "8af89633-d2bc-4f80-9e1e-0eb183f11462" (UID: "8af89633-d2bc-4f80-9e1e-0eb183f11462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.769469 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8x4z\" (UniqueName: \"kubernetes.io/projected/556e6ac3-8c64-4ee2-95f2-511a07bf220b-kube-api-access-b8x4z\") pod \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.769595 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwzvd\" (UniqueName: \"kubernetes.io/projected/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-kube-api-access-kwzvd\") pod \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.771485 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556e6ac3-8c64-4ee2-95f2-511a07bf220b-logs\") pod \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.771574 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-combined-ca-bundle\") pod \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.771660 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-combined-ca-bundle\") pod \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.771735 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-config-data\") pod \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\" (UID: \"5408d3e3-d1ab-45e8-b226-0eb3b26fe183\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.771757 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data\") pod \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.771818 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data-custom\") pod \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\" (UID: \"556e6ac3-8c64-4ee2-95f2-511a07bf220b\") " Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.772352 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af89633-d2bc-4f80-9e1e-0eb183f11462-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.772641 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.772659 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.772674 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r4sq\" (UniqueName: \"kubernetes.io/projected/8af89633-d2bc-4f80-9e1e-0eb183f11462-kube-api-access-6r4sq\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.772684 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8af89633-d2bc-4f80-9e1e-0eb183f11462-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.776871 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556e6ac3-8c64-4ee2-95f2-511a07bf220b-logs" (OuterVolumeSpecName: "logs") pod "556e6ac3-8c64-4ee2-95f2-511a07bf220b" (UID: "556e6ac3-8c64-4ee2-95f2-511a07bf220b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.777466 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "556e6ac3-8c64-4ee2-95f2-511a07bf220b" (UID: "556e6ac3-8c64-4ee2-95f2-511a07bf220b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.779869 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556e6ac3-8c64-4ee2-95f2-511a07bf220b-kube-api-access-b8x4z" (OuterVolumeSpecName: "kube-api-access-b8x4z") pod "556e6ac3-8c64-4ee2-95f2-511a07bf220b" (UID: "556e6ac3-8c64-4ee2-95f2-511a07bf220b"). InnerVolumeSpecName "kube-api-access-b8x4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.787193 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-kube-api-access-kwzvd" (OuterVolumeSpecName: "kube-api-access-kwzvd") pod "5408d3e3-d1ab-45e8-b226-0eb3b26fe183" (UID: "5408d3e3-d1ab-45e8-b226-0eb3b26fe183"). InnerVolumeSpecName "kube-api-access-kwzvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.813743 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5408d3e3-d1ab-45e8-b226-0eb3b26fe183" (UID: "5408d3e3-d1ab-45e8-b226-0eb3b26fe183"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.813867 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "556e6ac3-8c64-4ee2-95f2-511a07bf220b" (UID: "556e6ac3-8c64-4ee2-95f2-511a07bf220b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.817577 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-config-data" (OuterVolumeSpecName: "config-data") pod "5408d3e3-d1ab-45e8-b226-0eb3b26fe183" (UID: "5408d3e3-d1ab-45e8-b226-0eb3b26fe183"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.846346 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data" (OuterVolumeSpecName: "config-data") pod "556e6ac3-8c64-4ee2-95f2-511a07bf220b" (UID: "556e6ac3-8c64-4ee2-95f2-511a07bf220b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.881815 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.881861 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.881873 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.881882 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556e6ac3-8c64-4ee2-95f2-511a07bf220b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc 
kubenswrapper[5047]: I0223 07:11:25.881893 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8x4z\" (UniqueName: \"kubernetes.io/projected/556e6ac3-8c64-4ee2-95f2-511a07bf220b-kube-api-access-b8x4z\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.881923 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwzvd\" (UniqueName: \"kubernetes.io/projected/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-kube-api-access-kwzvd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.881933 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/556e6ac3-8c64-4ee2-95f2-511a07bf220b-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.881943 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5408d3e3-d1ab-45e8-b226-0eb3b26fe183-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.894395 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:25 crc kubenswrapper[5047]: I0223 07:11:25.894638 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.054570 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e307d3a1-af99-460b-bdd0-24e26de38751","Type":"ContainerDied","Data":"db94c1d793dd31119a747270bdd5f33a4fe8a08855386251bf346d8de1893013"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.054608 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.054660 5047 scope.go:117] "RemoveContainer" containerID="1843426c8ae30cc809f15219c4a71a415a94a52a2bc0a9a35f7e3c5e3ca3cd62" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.061056 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5864496f8c-9bg5d" event={"ID":"f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08","Type":"ContainerDied","Data":"8c777d86ec08197b30580a725928ba2f64aeba450777ffc5f6f6246ad3e4b4ea"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.061580 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5864496f8c-9bg5d" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.072659 5047 generic.go:334] "Generic (PLEG): container finished" podID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerID="d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786" exitCode=0 Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.072731 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" event={"ID":"556e6ac3-8c64-4ee2-95f2-511a07bf220b","Type":"ContainerDied","Data":"d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.072762 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" event={"ID":"556e6ac3-8c64-4ee2-95f2-511a07bf220b","Type":"ContainerDied","Data":"1b97eb217f1c10d1c7abd316a031554173458d0214ab00813822aaa1492304ec"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.072767 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-67c5f5b45b-szhrl" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.082239 5047 generic.go:334] "Generic (PLEG): container finished" podID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerID="09006301b0f6020685672d00703b449ef9d808aef32ba06e059c9e2356603a84" exitCode=0 Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.082311 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785b8b86cf-6rvf8" event={"ID":"8b703a8a-7e8f-4565-abc7-86f93a83e742","Type":"ContainerDied","Data":"09006301b0f6020685672d00703b449ef9d808aef32ba06e059c9e2356603a84"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.089334 5047 generic.go:334] "Generic (PLEG): container finished" podID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerID="9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe" exitCode=0 Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.089421 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-799dddc985-b669w" event={"ID":"8af89633-d2bc-4f80-9e1e-0eb183f11462","Type":"ContainerDied","Data":"9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.089456 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-799dddc985-b669w" event={"ID":"8af89633-d2bc-4f80-9e1e-0eb183f11462","Type":"ContainerDied","Data":"00e565f531572fa03633468e40118a0379a2309d335c1660f9e708811cfb3222"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.089522 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-799dddc985-b669w" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.094243 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.095451 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5408d3e3-d1ab-45e8-b226-0eb3b26fe183","Type":"ContainerDied","Data":"64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.098231 5047 generic.go:334] "Generic (PLEG): container finished" podID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" exitCode=0 Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.098491 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5408d3e3-d1ab-45e8-b226-0eb3b26fe183","Type":"ContainerDied","Data":"3b60870e78547f020b35c009e24ea5aeb788ace7ef0605c9ce610748dce17860"} Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.106711 5047 scope.go:117] "RemoveContainer" containerID="de1bae2fc8abe4518e053551089b717523cf45b88661909a16503dc7aee58baf" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.108358 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5864496f8c-9bg5d"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.136683 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5864496f8c-9bg5d"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.142081 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.149687 5047 scope.go:117] "RemoveContainer" containerID="235052c048504a3689eeba925718b5995ae8bb50fcdd7c36cd0a37a3cac313c6" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.151539 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.160666 5047 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-67c5f5b45b-szhrl"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.163400 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-67c5f5b45b-szhrl"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.179934 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.193967 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.200194 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-799dddc985-b669w"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.205343 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-799dddc985-b669w"] Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.225009 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.230128 5047 scope.go:117] "RemoveContainer" containerID="ac42b72f2ef3e70086002c27d0507227c425dcef23dd72ee4b41f90f506d1f7e" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.263422 5047 scope.go:117] "RemoveContainer" containerID="fc5d3551724091dfc09fc2b99a9eac93b56874e2010d67bb0d6736fc0d101cdb" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.288363 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-public-tls-certs\") pod \"8b703a8a-7e8f-4565-abc7-86f93a83e742\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.288415 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-httpd-config\") pod \"8b703a8a-7e8f-4565-abc7-86f93a83e742\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.288439 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-ovndb-tls-certs\") pod \"8b703a8a-7e8f-4565-abc7-86f93a83e742\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.288573 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-combined-ca-bundle\") pod \"8b703a8a-7e8f-4565-abc7-86f93a83e742\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.288612 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-hnvxl\" (UniqueName: \"kubernetes.io/projected/8b703a8a-7e8f-4565-abc7-86f93a83e742-kube-api-access-hnvxl\") pod \"8b703a8a-7e8f-4565-abc7-86f93a83e742\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.288637 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-config\") pod \"8b703a8a-7e8f-4565-abc7-86f93a83e742\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.288695 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-internal-tls-certs\") pod \"8b703a8a-7e8f-4565-abc7-86f93a83e742\" (UID: \"8b703a8a-7e8f-4565-abc7-86f93a83e742\") " Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.297323 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b703a8a-7e8f-4565-abc7-86f93a83e742-kube-api-access-hnvxl" (OuterVolumeSpecName: "kube-api-access-hnvxl") pod "8b703a8a-7e8f-4565-abc7-86f93a83e742" (UID: "8b703a8a-7e8f-4565-abc7-86f93a83e742"). InnerVolumeSpecName "kube-api-access-hnvxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.307251 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8b703a8a-7e8f-4565-abc7-86f93a83e742" (UID: "8b703a8a-7e8f-4565-abc7-86f93a83e742"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.307489 5047 scope.go:117] "RemoveContainer" containerID="d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.326290 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-config" (OuterVolumeSpecName: "config") pod "8b703a8a-7e8f-4565-abc7-86f93a83e742" (UID: "8b703a8a-7e8f-4565-abc7-86f93a83e742"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.328618 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b703a8a-7e8f-4565-abc7-86f93a83e742" (UID: "8b703a8a-7e8f-4565-abc7-86f93a83e742"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.330126 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8b703a8a-7e8f-4565-abc7-86f93a83e742" (UID: "8b703a8a-7e8f-4565-abc7-86f93a83e742"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.348646 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8b703a8a-7e8f-4565-abc7-86f93a83e742" (UID: "8b703a8a-7e8f-4565-abc7-86f93a83e742"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.352393 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" path="/var/lib/kubelet/pods/5408d3e3-d1ab-45e8-b226-0eb3b26fe183/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.353145 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" path="/var/lib/kubelet/pods/556e6ac3-8c64-4ee2-95f2-511a07bf220b/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.353842 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6776acf2-e53f-4892-847d-8667669a5eb9" path="/var/lib/kubelet/pods/6776acf2-e53f-4892-847d-8667669a5eb9/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.354420 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" path="/var/lib/kubelet/pods/6f9e1257-3765-4d4e-8110-81c55d1546d4/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.355655 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" path="/var/lib/kubelet/pods/8af89633-d2bc-4f80-9e1e-0eb183f11462/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.356267 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d08092-3eac-4289-b594-a77d5dfecfe9" path="/var/lib/kubelet/pods/95d08092-3eac-4289-b594-a77d5dfecfe9/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.357360 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" path="/var/lib/kubelet/pods/b9ebb281-4310-483e-b599-3d3c8775e341/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.358058 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" 
path="/var/lib/kubelet/pods/cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.358623 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" path="/var/lib/kubelet/pods/e307d3a1-af99-460b-bdd0-24e26de38751/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.359830 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" path="/var/lib/kubelet/pods/f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08/volumes" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.364880 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8b703a8a-7e8f-4565-abc7-86f93a83e742" (UID: "8b703a8a-7e8f-4565-abc7-86f93a83e742"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.368406 5047 scope.go:117] "RemoveContainer" containerID="aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.386157 5047 scope.go:117] "RemoveContainer" containerID="d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786" Feb 23 07:11:26 crc kubenswrapper[5047]: E0223 07:11:26.386638 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786\": container with ID starting with d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786 not found: ID does not exist" containerID="d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.386739 5047 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786"} err="failed to get container status \"d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786\": rpc error: code = NotFound desc = could not find container \"d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786\": container with ID starting with d7297c9cf9ea2fd2ebb4ac47a524950334eb77e75ff71aaa81cc766db2c30786 not found: ID does not exist" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.386848 5047 scope.go:117] "RemoveContainer" containerID="aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e" Feb 23 07:11:26 crc kubenswrapper[5047]: E0223 07:11:26.387680 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e\": container with ID starting with aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e not found: ID does not exist" containerID="aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.387735 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e"} err="failed to get container status \"aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e\": rpc error: code = NotFound desc = could not find container \"aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e\": container with ID starting with aa4a5146307cec757547a6d59c96484c78787b2844a2b96dc84e891c5cb4091e not found: ID does not exist" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.387774 5047 scope.go:117] "RemoveContainer" containerID="9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.390851 5047 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.391090 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnvxl\" (UniqueName: \"kubernetes.io/projected/8b703a8a-7e8f-4565-abc7-86f93a83e742-kube-api-access-hnvxl\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.391163 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.391234 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.391329 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.391396 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.391460 5047 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b703a8a-7e8f-4565-abc7-86f93a83e742-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.409868 5047 scope.go:117] "RemoveContainer" containerID="4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 
07:11:26.430984 5047 scope.go:117] "RemoveContainer" containerID="9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe" Feb 23 07:11:26 crc kubenswrapper[5047]: E0223 07:11:26.431995 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe\": container with ID starting with 9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe not found: ID does not exist" containerID="9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.432037 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe"} err="failed to get container status \"9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe\": rpc error: code = NotFound desc = could not find container \"9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe\": container with ID starting with 9aac78820b50823a3db600bfbb072af48e62a5a3552b7f3389ee00434e00efbe not found: ID does not exist" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.432087 5047 scope.go:117] "RemoveContainer" containerID="4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e" Feb 23 07:11:26 crc kubenswrapper[5047]: E0223 07:11:26.433414 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e\": container with ID starting with 4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e not found: ID does not exist" containerID="4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.433440 5047 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e"} err="failed to get container status \"4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e\": rpc error: code = NotFound desc = could not find container \"4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e\": container with ID starting with 4408ad01f05541b66645dba36a664dd57a94590d9f81a623e6f9d239cac9e77e not found: ID does not exist" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.433456 5047 scope.go:117] "RemoveContainer" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.455474 5047 scope.go:117] "RemoveContainer" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" Feb 23 07:11:26 crc kubenswrapper[5047]: E0223 07:11:26.455894 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd\": container with ID starting with 64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd not found: ID does not exist" containerID="64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.455948 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd"} err="failed to get container status \"64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd\": rpc error: code = NotFound desc = could not find container \"64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd\": container with ID starting with 64f76f96760c93b68fce0ad0c1eb56a19867ed104a4ab67f0db37f57fe4392bd not found: ID does not exist" Feb 23 07:11:26 crc kubenswrapper[5047]: I0223 07:11:26.953239 5047 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-2dqll" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="registry-server" probeResult="failure" output=< Feb 23 07:11:26 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 07:11:26 crc kubenswrapper[5047]: > Feb 23 07:11:27 crc kubenswrapper[5047]: I0223 07:11:27.110672 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785b8b86cf-6rvf8" event={"ID":"8b703a8a-7e8f-4565-abc7-86f93a83e742","Type":"ContainerDied","Data":"f20d701da5a4e0807519f207c49a42c6639b83dfbf4cfe7619425012b38e9f00"} Feb 23 07:11:27 crc kubenswrapper[5047]: I0223 07:11:27.110733 5047 scope.go:117] "RemoveContainer" containerID="563e1e2512e494abff485a89912e960e6a601055dece369633c27c745f0ea956" Feb 23 07:11:27 crc kubenswrapper[5047]: I0223 07:11:27.110891 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-785b8b86cf-6rvf8" Feb 23 07:11:27 crc kubenswrapper[5047]: I0223 07:11:27.152852 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-785b8b86cf-6rvf8"] Feb 23 07:11:27 crc kubenswrapper[5047]: I0223 07:11:27.159967 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-785b8b86cf-6rvf8"] Feb 23 07:11:27 crc kubenswrapper[5047]: I0223 07:11:27.161225 5047 scope.go:117] "RemoveContainer" containerID="09006301b0f6020685672d00703b449ef9d808aef32ba06e059c9e2356603a84" Feb 23 07:11:28 crc kubenswrapper[5047]: I0223 07:11:28.358543 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" path="/var/lib/kubelet/pods/8b703a8a-7e8f-4565-abc7-86f93a83e742/volumes" Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.201378 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.202965 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.203001 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.204036 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.204092 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" 
podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.204763 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.207139 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:30 crc kubenswrapper[5047]: E0223 07:11:30.207178 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.203124 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.204342 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.204520 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.205256 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.205329 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.206803 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 
07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.210693 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:35 crc kubenswrapper[5047]: E0223 07:11:35.210779 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:11:35 crc kubenswrapper[5047]: I0223 07:11:35.980998 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:36 crc kubenswrapper[5047]: I0223 07:11:36.048122 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:36 crc kubenswrapper[5047]: I0223 07:11:36.233299 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dqll"] Feb 23 07:11:37 crc kubenswrapper[5047]: I0223 07:11:37.264260 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2dqll" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="registry-server" containerID="cri-o://2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92" gracePeriod=2 Feb 23 07:11:37 crc kubenswrapper[5047]: I0223 07:11:37.802986 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.009492 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-catalog-content\") pod \"8fc70af3-2572-4019-adcb-8aecf538ae27\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.009663 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4hn8\" (UniqueName: \"kubernetes.io/projected/8fc70af3-2572-4019-adcb-8aecf538ae27-kube-api-access-c4hn8\") pod \"8fc70af3-2572-4019-adcb-8aecf538ae27\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.009743 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-utilities\") pod \"8fc70af3-2572-4019-adcb-8aecf538ae27\" (UID: \"8fc70af3-2572-4019-adcb-8aecf538ae27\") " Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.011533 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-utilities" (OuterVolumeSpecName: "utilities") pod "8fc70af3-2572-4019-adcb-8aecf538ae27" (UID: "8fc70af3-2572-4019-adcb-8aecf538ae27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.025048 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc70af3-2572-4019-adcb-8aecf538ae27-kube-api-access-c4hn8" (OuterVolumeSpecName: "kube-api-access-c4hn8") pod "8fc70af3-2572-4019-adcb-8aecf538ae27" (UID: "8fc70af3-2572-4019-adcb-8aecf538ae27"). InnerVolumeSpecName "kube-api-access-c4hn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.068554 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fc70af3-2572-4019-adcb-8aecf538ae27" (UID: "8fc70af3-2572-4019-adcb-8aecf538ae27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.113001 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.113075 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4hn8\" (UniqueName: \"kubernetes.io/projected/8fc70af3-2572-4019-adcb-8aecf538ae27-kube-api-access-c4hn8\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.113103 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc70af3-2572-4019-adcb-8aecf538ae27-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.279695 5047 generic.go:334] "Generic (PLEG): container finished" podID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerID="2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92" exitCode=0 Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.279752 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dqll" event={"ID":"8fc70af3-2572-4019-adcb-8aecf538ae27","Type":"ContainerDied","Data":"2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92"} Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.279794 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2dqll" event={"ID":"8fc70af3-2572-4019-adcb-8aecf538ae27","Type":"ContainerDied","Data":"b4540fa9d05dda117642801ed880bc527e2c94ba7d53cadaed2d9d9d9b2022b4"} Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.279819 5047 scope.go:117] "RemoveContainer" containerID="2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.279863 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dqll" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.326964 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dqll"] Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.330710 5047 scope.go:117] "RemoveContainer" containerID="dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.332537 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dqll"] Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.365221 5047 scope.go:117] "RemoveContainer" containerID="7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.384152 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" path="/var/lib/kubelet/pods/8fc70af3-2572-4019-adcb-8aecf538ae27/volumes" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.394112 5047 scope.go:117] "RemoveContainer" containerID="2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92" Feb 23 07:11:38 crc kubenswrapper[5047]: E0223 07:11:38.395243 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92\": container with ID 
starting with 2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92 not found: ID does not exist" containerID="2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.395371 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92"} err="failed to get container status \"2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92\": rpc error: code = NotFound desc = could not find container \"2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92\": container with ID starting with 2e4aefe747a46d056fa6021910e1471c44a08daf79fea129bbebb23182524d92 not found: ID does not exist" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.395459 5047 scope.go:117] "RemoveContainer" containerID="dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63" Feb 23 07:11:38 crc kubenswrapper[5047]: E0223 07:11:38.396011 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63\": container with ID starting with dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63 not found: ID does not exist" containerID="dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.396067 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63"} err="failed to get container status \"dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63\": rpc error: code = NotFound desc = could not find container \"dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63\": container with ID starting with dd6c4ff9505f7964d3926febad9b793b4fe8fc12c04332f122f10bc230d78a63 not found: 
ID does not exist" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.396126 5047 scope.go:117] "RemoveContainer" containerID="7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab" Feb 23 07:11:38 crc kubenswrapper[5047]: E0223 07:11:38.396695 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab\": container with ID starting with 7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab not found: ID does not exist" containerID="7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab" Feb 23 07:11:38 crc kubenswrapper[5047]: I0223 07:11:38.396768 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab"} err="failed to get container status \"7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab\": rpc error: code = NotFound desc = could not find container \"7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab\": container with ID starting with 7d8407173e772c769800365332bb20da85cafec8a5d5122f7467ba940f1ccaab not found: ID does not exist" Feb 23 07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.201235 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.202260 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.202981 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.203082 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.204454 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.206114 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 
07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.208490 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:40 crc kubenswrapper[5047]: E0223 07:11:40.208609 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.201471 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.202578 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.203065 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.203141 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.204468 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.207395 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.210863 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:11:45 crc kubenswrapper[5047]: E0223 07:11:45.210964 5047 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-f7lbh" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.397349 5047 generic.go:334] "Generic (PLEG): container finished" podID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerID="5438c98e6edb083f8efe8f16e3d1152881677a6739c97c34bfdb8575aac23a30" exitCode=137 Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.397414 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"5438c98e6edb083f8efe8f16e3d1152881677a6739c97c34bfdb8575aac23a30"} Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.604100 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.669497 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") pod \"62c288fd-a798-4337-882a-ab4ebb8331cb\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.669612 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-cache\") pod \"62c288fd-a798-4337-882a-ab4ebb8331cb\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.669751 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"62c288fd-a798-4337-882a-ab4ebb8331cb\" (UID: 
\"62c288fd-a798-4337-882a-ab4ebb8331cb\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.669854 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-lock\") pod \"62c288fd-a798-4337-882a-ab4ebb8331cb\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.670010 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wffzr\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-kube-api-access-wffzr\") pod \"62c288fd-a798-4337-882a-ab4ebb8331cb\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.670056 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c288fd-a798-4337-882a-ab4ebb8331cb-combined-ca-bundle\") pod \"62c288fd-a798-4337-882a-ab4ebb8331cb\" (UID: \"62c288fd-a798-4337-882a-ab4ebb8331cb\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.670494 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-lock" (OuterVolumeSpecName: "lock") pod "62c288fd-a798-4337-882a-ab4ebb8331cb" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.671068 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-cache" (OuterVolumeSpecName: "cache") pod "62c288fd-a798-4337-882a-ab4ebb8331cb" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.671211 5047 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-lock\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.671233 5047 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/62c288fd-a798-4337-882a-ab4ebb8331cb-cache\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.678886 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "62c288fd-a798-4337-882a-ab4ebb8331cb" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.679582 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "62c288fd-a798-4337-882a-ab4ebb8331cb" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.681100 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-kube-api-access-wffzr" (OuterVolumeSpecName: "kube-api-access-wffzr") pod "62c288fd-a798-4337-882a-ab4ebb8331cb" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb"). InnerVolumeSpecName "kube-api-access-wffzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.772514 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wffzr\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-kube-api-access-wffzr\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.772574 5047 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62c288fd-a798-4337-882a-ab4ebb8331cb-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.772609 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.787421 5047 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.876971 5047 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.966147 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f7lbh_df2d42b2-545c-47ab-ba87-ff81a4cced8d/ovs-vswitchd/0.log" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.967187 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.977638 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwq4t\" (UniqueName: \"kubernetes.io/projected/df2d42b2-545c-47ab-ba87-ff81a4cced8d-kube-api-access-mwq4t\") pod \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.977757 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-log\") pod \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.977797 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-run\") pod \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.977834 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-etc-ovs\") pod \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.977926 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-log" (OuterVolumeSpecName: "var-log") pod "df2d42b2-545c-47ab-ba87-ff81a4cced8d" (UID: "df2d42b2-545c-47ab-ba87-ff81a4cced8d"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.977934 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-run" (OuterVolumeSpecName: "var-run") pod "df2d42b2-545c-47ab-ba87-ff81a4cced8d" (UID: "df2d42b2-545c-47ab-ba87-ff81a4cced8d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.977996 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-lib\") pod \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.978023 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2d42b2-545c-47ab-ba87-ff81a4cced8d-scripts\") pod \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\" (UID: \"df2d42b2-545c-47ab-ba87-ff81a4cced8d\") " Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.978011 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "df2d42b2-545c-47ab-ba87-ff81a4cced8d" (UID: "df2d42b2-545c-47ab-ba87-ff81a4cced8d"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.978057 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-lib" (OuterVolumeSpecName: "var-lib") pod "df2d42b2-545c-47ab-ba87-ff81a4cced8d" (UID: "df2d42b2-545c-47ab-ba87-ff81a4cced8d"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.978457 5047 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-log\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.978477 5047 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.978490 5047 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.978503 5047 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/df2d42b2-545c-47ab-ba87-ff81a4cced8d-var-lib\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.979255 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2d42b2-545c-47ab-ba87-ff81a4cced8d-scripts" (OuterVolumeSpecName: "scripts") pod "df2d42b2-545c-47ab-ba87-ff81a4cced8d" (UID: "df2d42b2-545c-47ab-ba87-ff81a4cced8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.982019 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2d42b2-545c-47ab-ba87-ff81a4cced8d-kube-api-access-mwq4t" (OuterVolumeSpecName: "kube-api-access-mwq4t") pod "df2d42b2-545c-47ab-ba87-ff81a4cced8d" (UID: "df2d42b2-545c-47ab-ba87-ff81a4cced8d"). InnerVolumeSpecName "kube-api-access-mwq4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:46 crc kubenswrapper[5047]: I0223 07:11:46.985668 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c288fd-a798-4337-882a-ab4ebb8331cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62c288fd-a798-4337-882a-ab4ebb8331cb" (UID: "62c288fd-a798-4337-882a-ab4ebb8331cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.079816 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c288fd-a798-4337-882a-ab4ebb8331cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.080066 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2d42b2-545c-47ab-ba87-ff81a4cced8d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.080129 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwq4t\" (UniqueName: \"kubernetes.io/projected/df2d42b2-545c-47ab-ba87-ff81a4cced8d-kube-api-access-mwq4t\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.439997 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"62c288fd-a798-4337-882a-ab4ebb8331cb","Type":"ContainerDied","Data":"69eae6d331ca4b69ebdf013793798bcaf88077b4d26b91c8edac4565e0eb41e4"} Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.440060 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.440096 5047 scope.go:117] "RemoveContainer" containerID="5438c98e6edb083f8efe8f16e3d1152881677a6739c97c34bfdb8575aac23a30" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.445589 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f7lbh_df2d42b2-545c-47ab-ba87-ff81a4cced8d/ovs-vswitchd/0.log" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.447364 5047 generic.go:334] "Generic (PLEG): container finished" podID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" exitCode=137 Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.447416 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerDied","Data":"0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48"} Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.447466 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f7lbh" event={"ID":"df2d42b2-545c-47ab-ba87-ff81a4cced8d","Type":"ContainerDied","Data":"afc14c445101fdef81b94ea4fa12525cc441fc7d87326336d97f9bbbadb51fef"} Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.447500 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-f7lbh" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.488095 5047 scope.go:117] "RemoveContainer" containerID="1497b14f7cabd5e74a05e2a73df9cf90fa7bf74823b616c5182cd8dc8c2842e9" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.518961 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.525446 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.548273 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-f7lbh"] Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.550129 5047 scope.go:117] "RemoveContainer" containerID="610194c27d45dd83cb9a83ae67eb964cc41e536d39f862f18b7fbc1666c8f18e" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.555425 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-f7lbh"] Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.573844 5047 scope.go:117] "RemoveContainer" containerID="fb9fbedf021ff6c481977b592703012f584d141b404e356a7e74031c5b137c6a" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.591046 5047 scope.go:117] "RemoveContainer" containerID="5f0a6444a7b19329569218b09c622e3e4c94a546c76d994e6e7f55372800737f" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.607462 5047 scope.go:117] "RemoveContainer" containerID="e8eadc030f3e0f8a80e20468bb35964a892723d0a32063f32fdb445ccfa9a64c" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.628262 5047 scope.go:117] "RemoveContainer" containerID="2271c0a7b8b654682e42f554500d0387d96647dadb59e76a372f57a74ce531ad" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.646157 5047 scope.go:117] "RemoveContainer" containerID="cf249292c8f2ae891705ce8b850d38cf0ae227f4304c261b1ef67518dde444f6" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 
07:11:47.663261 5047 scope.go:117] "RemoveContainer" containerID="3322a1a63d982b007337c016df9dadd09417398badcaa9dcc4d0a6bf4fe12d41" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.680484 5047 scope.go:117] "RemoveContainer" containerID="7f2b74932ff71cab49843de5a99395f2f3995ed6305e9aef4d30ec156d1541a1" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.700961 5047 scope.go:117] "RemoveContainer" containerID="bd9a6c11cbd3bdfd513be881b8c42a51f7ab2e9b360a02ddb2a5e8383e963bd4" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.724456 5047 scope.go:117] "RemoveContainer" containerID="17ebe3519431a9953fc33fed5f2c323c9c8f0bbaad32401ddf2b831449414025" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.751812 5047 scope.go:117] "RemoveContainer" containerID="a19fd42fd14901f6743c36bb99a68aa21bd85e41d93af3960094cb69a6484383" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.777721 5047 scope.go:117] "RemoveContainer" containerID="ef7d56ab512ba7ddcef234fef06dfaa7575dadd48c522e8ea9b685da5f62a286" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.801673 5047 scope.go:117] "RemoveContainer" containerID="07872e17cdb6ef25b82b90ca7744f33fec2b7c83d31131b37d885019a804232f" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.823957 5047 scope.go:117] "RemoveContainer" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.847261 5047 scope.go:117] "RemoveContainer" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.868299 5047 scope.go:117] "RemoveContainer" containerID="7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.897961 5047 scope.go:117] "RemoveContainer" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" Feb 23 07:11:47 crc kubenswrapper[5047]: E0223 07:11:47.899419 5047 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48\": container with ID starting with 0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48 not found: ID does not exist" containerID="0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.899472 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48"} err="failed to get container status \"0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48\": rpc error: code = NotFound desc = could not find container \"0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48\": container with ID starting with 0571525a3f8929418d1517d0ce768e8de4566bf879c79ae6e097b2458cb8cf48 not found: ID does not exist" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.899507 5047 scope.go:117] "RemoveContainer" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" Feb 23 07:11:47 crc kubenswrapper[5047]: E0223 07:11:47.899806 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945\": container with ID starting with 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 not found: ID does not exist" containerID="62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.899839 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945"} err="failed to get container status \"62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945\": rpc error: code = NotFound 
desc = could not find container \"62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945\": container with ID starting with 62b57b1fc8f2e0510bf57ae9852cb797145b7e6ed0a97fca6990c358186ea945 not found: ID does not exist" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.899859 5047 scope.go:117] "RemoveContainer" containerID="7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c" Feb 23 07:11:47 crc kubenswrapper[5047]: E0223 07:11:47.900583 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c\": container with ID starting with 7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c not found: ID does not exist" containerID="7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c" Feb 23 07:11:47 crc kubenswrapper[5047]: I0223 07:11:47.900626 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c"} err="failed to get container status \"7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c\": rpc error: code = NotFound desc = could not find container \"7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c\": container with ID starting with 7153e512ce13449b45e83ffde83e02e89d094a91d5e62134e5877251c16f084c not found: ID does not exist" Feb 23 07:11:48 crc kubenswrapper[5047]: I0223 07:11:48.350395 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" path="/var/lib/kubelet/pods/62c288fd-a798-4337-882a-ab4ebb8331cb/volumes" Feb 23 07:11:48 crc kubenswrapper[5047]: I0223 07:11:48.353105 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" path="/var/lib/kubelet/pods/df2d42b2-545c-47ab-ba87-ff81a4cced8d/volumes" Feb 23 07:11:48 crc 
kubenswrapper[5047]: I0223 07:11:48.689225 5047 scope.go:117] "RemoveContainer" containerID="162cb7f8c6e55e75e580ec58b388952191f54181ae019a1fa7f507a11cbff878" Feb 23 07:11:48 crc kubenswrapper[5047]: I0223 07:11:48.724400 5047 scope.go:117] "RemoveContainer" containerID="bcb9e71f1342662d147a950f9bce3acb70d72672c9ecd0106c2560c4f8b80214" Feb 23 07:11:48 crc kubenswrapper[5047]: I0223 07:11:48.770584 5047 scope.go:117] "RemoveContainer" containerID="8b62822f363fb0c003c60c543c9eb3209f0b6f65b4cfe08cf2245bd1ec54bdae" Feb 23 07:11:53 crc kubenswrapper[5047]: I0223 07:11:53.219243 5047 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5c628150-3bde-4740-9c94-dc208f61ade2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5c628150-3bde-4740-9c94-dc208f61ade2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5c628150_3bde_4740_9c94_dc208f61ade2.slice" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.649354 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pvgqm"] Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650630 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650650 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-server" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650664 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650673 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650683 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-central-agent" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650693 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-central-agent" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650712 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650720 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-api" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650729 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-notification-agent" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650738 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-notification-agent" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650749 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-reaper" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650757 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-reaper" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650766 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650774 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: 
E0223 07:12:38.650786 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-expirer" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650793 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-expirer" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650807 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="swift-recon-cron" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650814 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="swift-recon-cron" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650823 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650831 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-log" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650844 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650852 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-log" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650865 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="registry-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650873 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="registry-server" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 
07:12:38.650885 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650895 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650940 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" containerName="nova-cell0-conductor-conductor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650951 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" containerName="nova-cell0-conductor-conductor" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650965 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650974 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.650988 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" containerName="keystone-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.650996 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" containerName="keystone-api" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651011 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651019 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: 
E0223 07:12:38.651029 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651036 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651048 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651055 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651068 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="rsync" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651075 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="rsync" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651083 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="proxy-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651090 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="proxy-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651102 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651110 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651118 
5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-updater" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651125 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-updater" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651135 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="ovn-northd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651143 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="ovn-northd" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651157 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-updater" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651166 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-updater" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651174 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651182 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker-log" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651191 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerName="mysql-bootstrap" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651198 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerName="mysql-bootstrap" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651207 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651215 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-log" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651227 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651235 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651247 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6776acf2-e53f-4892-847d-8667669a5eb9" containerName="setup-container" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651257 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6776acf2-e53f-4892-847d-8667669a5eb9" containerName="setup-container" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651267 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651274 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-api" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651284 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651291 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651301 5047 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="sg-core" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651309 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="sg-core" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651318 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="openstack-network-exporter" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651336 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="openstack-network-exporter" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651349 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651356 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener-log" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651368 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerName="setup-container" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651376 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerName="setup-container" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651389 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerName="mariadb-account-create-update" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651397 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerName="mariadb-account-create-update" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 
07:12:38.651407 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651415 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651424 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d08092-3eac-4289-b594-a77d5dfecfe9" containerName="kube-state-metrics" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651435 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d08092-3eac-4289-b594-a77d5dfecfe9" containerName="kube-state-metrics" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651443 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651453 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-server" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651467 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651474 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651485 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651493 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 
07:12:38.651504 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="extract-content" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651512 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="extract-content" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651524 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server-init" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651532 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server-init" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651544 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-metadata" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651553 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-metadata" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651567 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651575 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651590 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651600 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:12:38 crc kubenswrapper[5047]: 
E0223 07:12:38.651611 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6776acf2-e53f-4892-847d-8667669a5eb9" containerName="rabbitmq" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651621 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6776acf2-e53f-4892-847d-8667669a5eb9" containerName="rabbitmq" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651632 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c814692-df9e-470f-8aad-364d48f82b81" containerName="memcached" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651640 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c814692-df9e-470f-8aad-364d48f82b81" containerName="memcached" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651652 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerName="galera" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651659 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerName="galera" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651668 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerName="mariadb-account-create-update" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651673 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerName="mariadb-account-create-update" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651679 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="extract-utilities" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651686 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="extract-utilities" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651693 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651699 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651706 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651712 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-server" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651720 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651726 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651738 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerName="rabbitmq" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651744 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerName="rabbitmq" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651753 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651759 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651765 5047 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651772 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:12:38 crc kubenswrapper[5047]: E0223 07:12:38.651780 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c628150-3bde-4740-9c94-dc208f61ade2" containerName="nova-scheduler-scheduler" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651786 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c628150-3bde-4740-9c94-dc208f61ade2" containerName="nova-scheduler-scheduler" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651947 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-updater" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651960 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovsdb-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651969 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651976 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651985 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c628150-3bde-4740-9c94-dc208f61ade2" containerName="nova-scheduler-scheduler" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.651994 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="openstack-network-exporter" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652001 5047 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-metadata" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652011 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5408d3e3-d1ab-45e8-b226-0eb3b26fe183" containerName="nova-cell0-conductor-conductor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652017 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="swift-recon-cron" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652025 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc70af3-2572-4019-adcb-8aecf538ae27" containerName="registry-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652036 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652045 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="proxy-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652053 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd35e1f1-f6b3-42ca-9f2c-72ad4807e57b" containerName="galera" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652060 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652066 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652072 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-log" Feb 23 07:12:38 crc 
kubenswrapper[5047]: I0223 07:12:38.652080 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-updater" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652091 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c814692-df9e-470f-8aad-364d48f82b81" containerName="memcached" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652097 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6776acf2-e53f-4892-847d-8667669a5eb9" containerName="rabbitmq" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652104 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652114 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d08092-3eac-4289-b594-a77d5dfecfe9" containerName="kube-state-metrics" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652125 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d03df4-c334-4c64-a273-e4e307df5add" containerName="glance-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652133 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af89633-d2bc-4f80-9e1e-0eb183f11462" containerName="barbican-worker-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652143 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2d42b2-545c-47ab-ba87-ff81a4cced8d" containerName="ovs-vswitchd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652151 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652160 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" 
containerName="account-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652167 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="rsync" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652176 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="27668b66-4868-448a-b2dd-e270ed4bc677" containerName="barbican-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652183 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-notification-agent" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652189 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652199 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerName="mariadb-account-create-update" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652205 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e83857b-5e17-4878-8f9b-e8d1a65325ba" containerName="rabbitmq" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652215 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4c480a-88f1-42e1-bdce-21bdb85ecc48" containerName="nova-api-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652224 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652232 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-expirer" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652243 5047 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="sg-core" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652250 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cdbcdf8-785e-4383-ab02-4492360bf4b4" containerName="mariadb-account-create-update" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652257 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d3d9df-fbd7-4f07-84b3-a51fff5d5e08" containerName="keystone-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652264 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652270 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e307d3a1-af99-460b-bdd0-24e26de38751" containerName="ceilometer-central-agent" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652276 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652282 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9eb562-3f84-458f-885a-e2fbb3e86bf3" containerName="glance-httpd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652289 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-reaper" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652297 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652303 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="container-replicator" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652310 5047 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8b703a8a-7e8f-4565-abc7-86f93a83e742" containerName="neutron-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652317 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3349ce-92c0-4fc5-ae0e-6424be7ca179" containerName="nova-metadata-log" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652324 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ebb281-4310-483e-b599-3d3c8775e341" containerName="ovn-northd" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652331 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9e1257-3765-4d4e-8110-81c55d1546d4" containerName="ovn-controller" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652340 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="object-auditor" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652350 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb737ca7-c18f-4ff9-9285-bb35ee17cd05" containerName="cinder-api" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652358 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c288fd-a798-4337-882a-ab4ebb8331cb" containerName="account-server" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.652366 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="556e6ac3-8c64-4ee2-95f2-511a07bf220b" containerName="barbican-keystone-listener" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.653434 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.658774 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvgqm"] Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.737996 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-catalog-content\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.738081 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-utilities\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.738393 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rv2\" (UniqueName: \"kubernetes.io/projected/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-kube-api-access-l4rv2\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.839258 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rv2\" (UniqueName: \"kubernetes.io/projected/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-kube-api-access-l4rv2\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.839345 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-catalog-content\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.839415 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-utilities\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.840051 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-utilities\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.840182 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-catalog-content\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.864054 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rv2\" (UniqueName: \"kubernetes.io/projected/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-kube-api-access-l4rv2\") pod \"community-operators-pvgqm\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:38 crc kubenswrapper[5047]: I0223 07:12:38.974960 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:39 crc kubenswrapper[5047]: I0223 07:12:39.538442 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvgqm"] Feb 23 07:12:39 crc kubenswrapper[5047]: I0223 07:12:39.998386 5047 generic.go:334] "Generic (PLEG): container finished" podID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerID="6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a" exitCode=0 Feb 23 07:12:40 crc kubenswrapper[5047]: I0223 07:12:39.998513 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvgqm" event={"ID":"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5","Type":"ContainerDied","Data":"6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a"} Feb 23 07:12:40 crc kubenswrapper[5047]: I0223 07:12:40.002107 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvgqm" event={"ID":"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5","Type":"ContainerStarted","Data":"c9cd1dc950024e3dbd8180ba24a339de50f33c40fd5e852c9b7b25f7a7c490eb"} Feb 23 07:12:41 crc kubenswrapper[5047]: I0223 07:12:41.013657 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvgqm" event={"ID":"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5","Type":"ContainerStarted","Data":"f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af"} Feb 23 07:12:42 crc kubenswrapper[5047]: I0223 07:12:42.033305 5047 generic.go:334] "Generic (PLEG): container finished" podID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerID="f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af" exitCode=0 Feb 23 07:12:42 crc kubenswrapper[5047]: I0223 07:12:42.033702 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvgqm" 
event={"ID":"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5","Type":"ContainerDied","Data":"f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af"} Feb 23 07:12:43 crc kubenswrapper[5047]: I0223 07:12:43.049006 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvgqm" event={"ID":"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5","Type":"ContainerStarted","Data":"76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b"} Feb 23 07:12:43 crc kubenswrapper[5047]: I0223 07:12:43.085665 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pvgqm" podStartSLOduration=2.688970585 podStartE2EDuration="5.085630008s" podCreationTimestamp="2026-02-23 07:12:38 +0000 UTC" firstStartedPulling="2026-02-23 07:12:40.000795598 +0000 UTC m=+1682.252122742" lastFinishedPulling="2026-02-23 07:12:42.397455011 +0000 UTC m=+1684.648782165" observedRunningTime="2026-02-23 07:12:43.073389736 +0000 UTC m=+1685.324716870" watchObservedRunningTime="2026-02-23 07:12:43.085630008 +0000 UTC m=+1685.336957182" Feb 23 07:12:46 crc kubenswrapper[5047]: I0223 07:12:46.760530 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:12:46 crc kubenswrapper[5047]: I0223 07:12:46.761131 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:12:48 crc kubenswrapper[5047]: I0223 07:12:48.975740 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:48 crc kubenswrapper[5047]: I0223 07:12:48.975946 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.053329 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.204369 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.305519 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvgqm"] Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.605825 5047 scope.go:117] "RemoveContainer" containerID="af59b9535f389d7d7a06a601dbefe85600115748ae0d09b9bba1ac8ff86c9b73" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.641407 5047 scope.go:117] "RemoveContainer" containerID="78644fef8da1bd0cdac4bf80a09b3f519c55c9bc25ca47733d8b832f3f4e6e06" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.675104 5047 scope.go:117] "RemoveContainer" containerID="9019902801f87614a70096198dfd3e7b6168c72d3eeaac06fc70c4c41623d9e7" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.731195 5047 scope.go:117] "RemoveContainer" containerID="d69b5e2523509c2ac1197e8192f774f48b4003ecd9c6f511dc9d68dbb5ace140" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.781121 5047 scope.go:117] "RemoveContainer" containerID="ce63d2873c33fcb8fe6a6f80101295092c9629334e7ca2d463fb1fa48336a19d" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.842997 5047 scope.go:117] "RemoveContainer" containerID="5a75079717a470b29de40ec2a4f36e2754c02b302a2199b3ba843bf798995f82" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.873279 5047 scope.go:117] "RemoveContainer" 
containerID="7f3ef48ce4065d19a564ca8ebcc229080a58904b941c27c470fd874c0374ed7f" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.902395 5047 scope.go:117] "RemoveContainer" containerID="5bea11547183c48486f53b580384d648b13b5009d4bcf4723ac380d79f01172f" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.938526 5047 scope.go:117] "RemoveContainer" containerID="891892cac348d183815d7ffe54e852d6a7c8f16d8e5289f24d91b7f0c6b96fac" Feb 23 07:12:49 crc kubenswrapper[5047]: I0223 07:12:49.967322 5047 scope.go:117] "RemoveContainer" containerID="db02e6016211bb19ce14e8399eb56289a991d891d3f38632e18d21c7226ecf33" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.006573 5047 scope.go:117] "RemoveContainer" containerID="c6a6e7d909b55a3b0c1abec01195b34c6c1aa06fec45edbc14f4f0e71b821777" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.041305 5047 scope.go:117] "RemoveContainer" containerID="79c4349a7439839109ef0f83c46bc4618e14ddc90456258b12240e41e67a98fb" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.073863 5047 scope.go:117] "RemoveContainer" containerID="752c7e2e7f7fbc780ee3298c5345b10c649bdfe5f98e119767fa42545912892e" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.118389 5047 scope.go:117] "RemoveContainer" containerID="0ad3ee1b7a850db2d2c262c0e56cbbd4c72b1c1fab94b408cf7c5149a500b02c" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.148420 5047 scope.go:117] "RemoveContainer" containerID="2336c09e2d0dcabaf0c9bd5811bb1fa5b57db421e70518d3ce36701f7bc66f57" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.184604 5047 scope.go:117] "RemoveContainer" containerID="a1572c2ddd422b04e8cd0b8fc284f5c4a477827e929ddc442a91b9dc6f919ca6" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.206859 5047 scope.go:117] "RemoveContainer" containerID="d14d86530a147cf0d67613e782e89086bd0d587992f1573cb82b255d08564b9a" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.251342 5047 scope.go:117] "RemoveContainer" 
containerID="a03cc75d00468d42e82d8719df0248c48ca5c394864f1ed186b8504f10f3ac61" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.278006 5047 scope.go:117] "RemoveContainer" containerID="1267325c6c80fc648355dcf0f22835eb1816566e49499882efde0dc53f579027" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.306577 5047 scope.go:117] "RemoveContainer" containerID="f8b0272cd2fe35be240a50c0e711eb436bc05335fe0527ea4e69dae436cba0f8" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.347488 5047 scope.go:117] "RemoveContainer" containerID="2838da4fc4581c30df7297386992da7a7d97e1c93377f3b9bedfba25d4cf24ed" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.400722 5047 scope.go:117] "RemoveContainer" containerID="2d797d6a9c01a1cb9479944763429bf1039d9581a793335a6b8e1f9d395e3154" Feb 23 07:12:50 crc kubenswrapper[5047]: I0223 07:12:50.445152 5047 scope.go:117] "RemoveContainer" containerID="c322513d0f04b9b096c0fe9678a7d5ad8b11f3442f3ff8b0d76101352a14947a" Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.202236 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pvgqm" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="registry-server" containerID="cri-o://76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b" gracePeriod=2 Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.712567 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.890155 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-catalog-content\") pod \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.890339 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4rv2\" (UniqueName: \"kubernetes.io/projected/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-kube-api-access-l4rv2\") pod \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.890443 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-utilities\") pod \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\" (UID: \"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5\") " Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.892787 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-utilities" (OuterVolumeSpecName: "utilities") pod "fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" (UID: "fd08396e-be5a-451c-8da7-b4b0d9e0b4c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.901417 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-kube-api-access-l4rv2" (OuterVolumeSpecName: "kube-api-access-l4rv2") pod "fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" (UID: "fd08396e-be5a-451c-8da7-b4b0d9e0b4c5"). InnerVolumeSpecName "kube-api-access-l4rv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.978945 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" (UID: "fd08396e-be5a-451c-8da7-b4b0d9e0b4c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.992370 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.992428 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4rv2\" (UniqueName: \"kubernetes.io/projected/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-kube-api-access-l4rv2\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:51 crc kubenswrapper[5047]: I0223 07:12:51.992449 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.215052 5047 generic.go:334] "Generic (PLEG): container finished" podID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerID="76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b" exitCode=0 Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.215188 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvgqm" event={"ID":"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5","Type":"ContainerDied","Data":"76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b"} Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.215216 5047 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-pvgqm" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.215272 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvgqm" event={"ID":"fd08396e-be5a-451c-8da7-b4b0d9e0b4c5","Type":"ContainerDied","Data":"c9cd1dc950024e3dbd8180ba24a339de50f33c40fd5e852c9b7b25f7a7c490eb"} Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.215310 5047 scope.go:117] "RemoveContainer" containerID="76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.242101 5047 scope.go:117] "RemoveContainer" containerID="f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.263104 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvgqm"] Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.273118 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pvgqm"] Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.286729 5047 scope.go:117] "RemoveContainer" containerID="6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.315922 5047 scope.go:117] "RemoveContainer" containerID="76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b" Feb 23 07:12:52 crc kubenswrapper[5047]: E0223 07:12:52.316538 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b\": container with ID starting with 76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b not found: ID does not exist" containerID="76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.316595 
5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b"} err="failed to get container status \"76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b\": rpc error: code = NotFound desc = could not find container \"76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b\": container with ID starting with 76818cfd98e55867be5cc11144245faa292a97fce8a27515f2f70275f6a5ed0b not found: ID does not exist" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.316635 5047 scope.go:117] "RemoveContainer" containerID="f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af" Feb 23 07:12:52 crc kubenswrapper[5047]: E0223 07:12:52.317170 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af\": container with ID starting with f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af not found: ID does not exist" containerID="f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.317220 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af"} err="failed to get container status \"f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af\": rpc error: code = NotFound desc = could not find container \"f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af\": container with ID starting with f9a5e9ffe752995f0ea8d86dc53351f91491ccbb60e472b8c618cfbe7cf995af not found: ID does not exist" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.317255 5047 scope.go:117] "RemoveContainer" containerID="6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a" Feb 23 07:12:52 crc kubenswrapper[5047]: E0223 
07:12:52.317523 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a\": container with ID starting with 6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a not found: ID does not exist" containerID="6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.317554 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a"} err="failed to get container status \"6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a\": rpc error: code = NotFound desc = could not find container \"6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a\": container with ID starting with 6fbd43322c22996efb1036e6ccc39178f23d679f314dc9bf99e835b997cd587a not found: ID does not exist" Feb 23 07:12:52 crc kubenswrapper[5047]: I0223 07:12:52.349451 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" path="/var/lib/kubelet/pods/fd08396e-be5a-451c-8da7-b4b0d9e0b4c5/volumes" Feb 23 07:13:16 crc kubenswrapper[5047]: I0223 07:13:16.760061 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:13:16 crc kubenswrapper[5047]: I0223 07:13:16.760935 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 07:13:46 crc kubenswrapper[5047]: I0223 07:13:46.760067 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:13:46 crc kubenswrapper[5047]: I0223 07:13:46.761124 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:13:46 crc kubenswrapper[5047]: I0223 07:13:46.761234 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:13:46 crc kubenswrapper[5047]: I0223 07:13:46.762594 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:13:46 crc kubenswrapper[5047]: I0223 07:13:46.762720 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" gracePeriod=600 Feb 23 07:13:46 crc kubenswrapper[5047]: E0223 07:13:46.897391 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:13:47 crc kubenswrapper[5047]: I0223 07:13:47.856898 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" exitCode=0 Feb 23 07:13:47 crc kubenswrapper[5047]: I0223 07:13:47.857140 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"} Feb 23 07:13:47 crc kubenswrapper[5047]: I0223 07:13:47.857652 5047 scope.go:117] "RemoveContainer" containerID="837f7c0cf58cb1629ad928ad357807b43700d666b61b873505ed015b092129de" Feb 23 07:13:47 crc kubenswrapper[5047]: I0223 07:13:47.859028 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" Feb 23 07:13:47 crc kubenswrapper[5047]: E0223 07:13:47.859649 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.184543 5047 scope.go:117] "RemoveContainer" containerID="cd436dcf3211130ff764f076fed340d93284320db3baceec202f8ab4ad5a1294" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.258176 5047 scope.go:117] 
"RemoveContainer" containerID="999e38cfe887aedaec12de40ca31ea006bf118394964b12586bbd7546473dc73" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.284005 5047 scope.go:117] "RemoveContainer" containerID="b1156e1b07cb20a8e9857fce83fae53cb9cfb48c367023bcb30470fba1c4f122" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.313070 5047 scope.go:117] "RemoveContainer" containerID="31926a2974cd4ca21172fb1cd25667b92d592918db322cb316be55add0a17a1d" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.348597 5047 scope.go:117] "RemoveContainer" containerID="1bff42b06a8741633963197012e9bcd6401b4df5472444c488f4134bb829bd6d" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.376389 5047 scope.go:117] "RemoveContainer" containerID="80ad6f181096756b5b17df7bddd5e7b9f97d35cea4ff8798e80c1362b7bcf2a8" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.433785 5047 scope.go:117] "RemoveContainer" containerID="bb85a2f1ff6def18a756205ee4f5c162129c0424fc0d632f8cb5d4b7d746121f" Feb 23 07:13:51 crc kubenswrapper[5047]: I0223 07:13:51.465075 5047 scope.go:117] "RemoveContainer" containerID="a0e78cccdc69fcb4e0931f1814bebbb8de46564c39b911b4f4864d69cd56c138" Feb 23 07:13:59 crc kubenswrapper[5047]: I0223 07:13:59.341057 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" Feb 23 07:13:59 crc kubenswrapper[5047]: E0223 07:13:59.342117 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:14:12 crc kubenswrapper[5047]: I0223 07:14:12.341336 5047 scope.go:117] "RemoveContainer" 
containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" Feb 23 07:14:12 crc kubenswrapper[5047]: E0223 07:14:12.342714 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:14:26 crc kubenswrapper[5047]: I0223 07:14:26.341643 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" Feb 23 07:14:26 crc kubenswrapper[5047]: E0223 07:14:26.342622 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:14:37 crc kubenswrapper[5047]: I0223 07:14:37.342282 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" Feb 23 07:14:37 crc kubenswrapper[5047]: E0223 07:14:37.345824 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:14:49 crc kubenswrapper[5047]: I0223 07:14:49.340971 5047 scope.go:117] 
"RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" Feb 23 07:14:49 crc kubenswrapper[5047]: E0223 07:14:49.342142 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.634547 5047 scope.go:117] "RemoveContainer" containerID="95da99e7abef27d0904bfe545c3f30b1b0448c94b49b798f69241c902b5aaf0e" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.667134 5047 scope.go:117] "RemoveContainer" containerID="61434e9e4d467386a78cd46fa836a42ead7ba33809fa4c13e1ad548d69e1b881" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.712690 5047 scope.go:117] "RemoveContainer" containerID="b5495e9f8841c7dec1c3b19f82d389219f186ba7641ce0bc83c99ddb4383352d" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.740188 5047 scope.go:117] "RemoveContainer" containerID="370e63e6a062f56f822bb2fff7821e2ee1f32ee82378743cc49b225b72866f01" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.762730 5047 scope.go:117] "RemoveContainer" containerID="85d35c7768b566f85eab0cf6425b4ba1d8ca97cbc5b6722b7a066d3865422d8e" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.817407 5047 scope.go:117] "RemoveContainer" containerID="a45e4023fdc3f5de91bd12746e0ec06ad8dc73f14a73ff3efba60ce6bbe4cb68" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.849494 5047 scope.go:117] "RemoveContainer" containerID="b2223304e21e1939208327e76234fdea34d918e35b270aa7f780d4e3c732e5cb" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.878866 5047 scope.go:117] "RemoveContainer" 
containerID="b44f6b2c8f1e447760bbca2e2eef8fead4c67ab049896aef3ccca0fea1b3b631" Feb 23 07:14:51 crc kubenswrapper[5047]: I0223 07:14:51.916265 5047 scope.go:117] "RemoveContainer" containerID="0e3b47db4cbec9007d82a1b680b785316dfde205b6f0c34008d7119621f5b73a" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.172058 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"] Feb 23 07:15:00 crc kubenswrapper[5047]: E0223 07:15:00.173656 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="extract-utilities" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.173691 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="extract-utilities" Feb 23 07:15:00 crc kubenswrapper[5047]: E0223 07:15:00.173726 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="registry-server" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.173743 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="registry-server" Feb 23 07:15:00 crc kubenswrapper[5047]: E0223 07:15:00.173779 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="extract-content" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.173793 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="extract-content" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.174434 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd08396e-be5a-451c-8da7-b4b0d9e0b4c5" containerName="registry-server" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.175379 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.179495 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.184791 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.185737 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698c5219-ee1d-4076-a7fd-db26594dc4a8-config-volume\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.185879 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698c5219-ee1d-4076-a7fd-db26594dc4a8-secret-volume\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.185959 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w4t\" (UniqueName: \"kubernetes.io/projected/698c5219-ee1d-4076-a7fd-db26594dc4a8-kube-api-access-h2w4t\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.191411 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"] Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.287340 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w4t\" (UniqueName: \"kubernetes.io/projected/698c5219-ee1d-4076-a7fd-db26594dc4a8-kube-api-access-h2w4t\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.287437 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698c5219-ee1d-4076-a7fd-db26594dc4a8-config-volume\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.287529 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698c5219-ee1d-4076-a7fd-db26594dc4a8-secret-volume\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.289682 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698c5219-ee1d-4076-a7fd-db26594dc4a8-config-volume\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.299380 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/698c5219-ee1d-4076-a7fd-db26594dc4a8-secret-volume\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"
Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.321017 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w4t\" (UniqueName: \"kubernetes.io/projected/698c5219-ee1d-4076-a7fd-db26594dc4a8-kube-api-access-h2w4t\") pod \"collect-profiles-29530515-t6vcz\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"
Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.550175 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"
Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.823350 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"]
Feb 23 07:15:00 crc kubenswrapper[5047]: I0223 07:15:00.930098 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" event={"ID":"698c5219-ee1d-4076-a7fd-db26594dc4a8","Type":"ContainerStarted","Data":"79fd5cf2a34f2d8d993d8e9b73700e7098314f873129675df0525e5a98ce20ce"}
Feb 23 07:15:01 crc kubenswrapper[5047]: I0223 07:15:01.944824 5047 generic.go:334] "Generic (PLEG): container finished" podID="698c5219-ee1d-4076-a7fd-db26594dc4a8" containerID="a60c336d0e0b6f5656493549d938a4e8047d61f249e7bffc4c7a30c54466afe4" exitCode=0
Feb 23 07:15:01 crc kubenswrapper[5047]: I0223 07:15:01.944941 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" event={"ID":"698c5219-ee1d-4076-a7fd-db26594dc4a8","Type":"ContainerDied","Data":"a60c336d0e0b6f5656493549d938a4e8047d61f249e7bffc4c7a30c54466afe4"}
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.329875 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.341337 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.341838 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2w4t\" (UniqueName: \"kubernetes.io/projected/698c5219-ee1d-4076-a7fd-db26594dc4a8-kube-api-access-h2w4t\") pod \"698c5219-ee1d-4076-a7fd-db26594dc4a8\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") "
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.342083 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698c5219-ee1d-4076-a7fd-db26594dc4a8-secret-volume\") pod \"698c5219-ee1d-4076-a7fd-db26594dc4a8\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") "
Feb 23 07:15:03 crc kubenswrapper[5047]: E0223 07:15:03.345713 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.346684 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698c5219-ee1d-4076-a7fd-db26594dc4a8-config-volume\") pod \"698c5219-ee1d-4076-a7fd-db26594dc4a8\" (UID: \"698c5219-ee1d-4076-a7fd-db26594dc4a8\") "
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.347417 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c5219-ee1d-4076-a7fd-db26594dc4a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "698c5219-ee1d-4076-a7fd-db26594dc4a8" (UID: "698c5219-ee1d-4076-a7fd-db26594dc4a8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.351275 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698c5219-ee1d-4076-a7fd-db26594dc4a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "698c5219-ee1d-4076-a7fd-db26594dc4a8" (UID: "698c5219-ee1d-4076-a7fd-db26594dc4a8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.351518 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698c5219-ee1d-4076-a7fd-db26594dc4a8-kube-api-access-h2w4t" (OuterVolumeSpecName: "kube-api-access-h2w4t") pod "698c5219-ee1d-4076-a7fd-db26594dc4a8" (UID: "698c5219-ee1d-4076-a7fd-db26594dc4a8"). InnerVolumeSpecName "kube-api-access-h2w4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.451144 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2w4t\" (UniqueName: \"kubernetes.io/projected/698c5219-ee1d-4076-a7fd-db26594dc4a8-kube-api-access-h2w4t\") on node \"crc\" DevicePath \"\""
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.451178 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698c5219-ee1d-4076-a7fd-db26594dc4a8-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.451187 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698c5219-ee1d-4076-a7fd-db26594dc4a8-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.963590 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz" event={"ID":"698c5219-ee1d-4076-a7fd-db26594dc4a8","Type":"ContainerDied","Data":"79fd5cf2a34f2d8d993d8e9b73700e7098314f873129675df0525e5a98ce20ce"}
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.963642 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79fd5cf2a34f2d8d993d8e9b73700e7098314f873129675df0525e5a98ce20ce"
Feb 23 07:15:03 crc kubenswrapper[5047]: I0223 07:15:03.963705 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"
Feb 23 07:15:16 crc kubenswrapper[5047]: I0223 07:15:16.341865 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:15:16 crc kubenswrapper[5047]: E0223 07:15:16.343280 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:15:29 crc kubenswrapper[5047]: I0223 07:15:29.342594 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:15:29 crc kubenswrapper[5047]: E0223 07:15:29.343854 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:15:42 crc kubenswrapper[5047]: I0223 07:15:42.341637 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:15:42 crc kubenswrapper[5047]: E0223 07:15:42.343282 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:15:52 crc kubenswrapper[5047]: I0223 07:15:52.061926 5047 scope.go:117] "RemoveContainer" containerID="74b40c582ae12999f42d1eef301f44f0da1417815226e698bfe5b38ce08e9a34"
Feb 23 07:15:52 crc kubenswrapper[5047]: I0223 07:15:52.646696 5047 scope.go:117] "RemoveContainer" containerID="954c1f1ee794e4e77d08d6c1a2df71e2d46a99260aa2ccf3f544ba5c564093a1"
Feb 23 07:15:52 crc kubenswrapper[5047]: I0223 07:15:52.710657 5047 scope.go:117] "RemoveContainer" containerID="fd49a3d170be31b10745d22fe236e4a8ddd84be4aa8f10cffd23d28b0ec0dcb6"
Feb 23 07:15:52 crc kubenswrapper[5047]: I0223 07:15:52.735462 5047 scope.go:117] "RemoveContainer" containerID="897b14a78dfbcdad54f56571d5c5ba61205fe2678162c41bb69a05cb9c0da5b7"
Feb 23 07:15:57 crc kubenswrapper[5047]: I0223 07:15:57.341013 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:15:57 crc kubenswrapper[5047]: E0223 07:15:57.341953 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:16:10 crc kubenswrapper[5047]: I0223 07:16:10.341627 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:16:10 crc kubenswrapper[5047]: E0223 07:16:10.343551 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:16:24 crc kubenswrapper[5047]: I0223 07:16:24.345548 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:16:24 crc kubenswrapper[5047]: E0223 07:16:24.346720 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:16:35 crc kubenswrapper[5047]: I0223 07:16:35.341463 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:16:35 crc kubenswrapper[5047]: E0223 07:16:35.342342 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:16:50 crc kubenswrapper[5047]: I0223 07:16:50.342272 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:16:50 crc kubenswrapper[5047]: E0223 07:16:50.343623 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:16:52 crc kubenswrapper[5047]: I0223 07:16:52.826123 5047 scope.go:117] "RemoveContainer" containerID="106b2bdb20d1a2d2e867ad06c463fa1d14f76f5b815d49afbd6405b8e6d0af57"
Feb 23 07:16:52 crc kubenswrapper[5047]: I0223 07:16:52.863092 5047 scope.go:117] "RemoveContainer" containerID="16b656be4c9cc38ec648ec0e389c1956835cc8867344cd5dc329014321e6c9af"
Feb 23 07:16:52 crc kubenswrapper[5047]: I0223 07:16:52.932012 5047 scope.go:117] "RemoveContainer" containerID="673e725806a724d8ff4b5a6b0941c0c98cab0db5716267243f589266bfe8fc9f"
Feb 23 07:17:01 crc kubenswrapper[5047]: I0223 07:17:01.341848 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:17:01 crc kubenswrapper[5047]: E0223 07:17:01.343222 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:17:16 crc kubenswrapper[5047]: I0223 07:17:16.341947 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:17:16 crc kubenswrapper[5047]: E0223 07:17:16.343328 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:17:29 crc kubenswrapper[5047]: I0223 07:17:29.342399 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:17:29 crc kubenswrapper[5047]: E0223 07:17:29.343825 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:17:42 crc kubenswrapper[5047]: I0223 07:17:42.341440 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:17:42 crc kubenswrapper[5047]: E0223 07:17:42.342731 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:17:56 crc kubenswrapper[5047]: I0223 07:17:56.341987 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:17:56 crc kubenswrapper[5047]: E0223 07:17:56.343270 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:18:08 crc kubenswrapper[5047]: I0223 07:18:08.346595 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:18:08 crc kubenswrapper[5047]: E0223 07:18:08.347808 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:18:20 crc kubenswrapper[5047]: I0223 07:18:20.340592 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:18:20 crc kubenswrapper[5047]: E0223 07:18:20.341630 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:18:35 crc kubenswrapper[5047]: I0223 07:18:35.341561 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:18:35 crc kubenswrapper[5047]: E0223 07:18:35.342707 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:18:49 crc kubenswrapper[5047]: I0223 07:18:49.341140 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0"
Feb 23 07:18:50 crc kubenswrapper[5047]: I0223 07:18:50.190768 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"0d13b86f3c29a55a8034ca0dd7a08a0b8d33896a1a4f64bdde470e123ac2b1e9"}
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.417851 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pbw2h"]
Feb 23 07:20:03 crc kubenswrapper[5047]: E0223 07:20:03.418854 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698c5219-ee1d-4076-a7fd-db26594dc4a8" containerName="collect-profiles"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.418872 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="698c5219-ee1d-4076-a7fd-db26594dc4a8" containerName="collect-profiles"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.419070 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="698c5219-ee1d-4076-a7fd-db26594dc4a8" containerName="collect-profiles"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.420289 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.436100 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pbw2h"]
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.537661 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8369b050-4b0f-461d-8629-9425f1997ee5-utilities\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.537733 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8369b050-4b0f-461d-8629-9425f1997ee5-catalog-content\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.537843 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trm64\" (UniqueName: \"kubernetes.io/projected/8369b050-4b0f-461d-8629-9425f1997ee5-kube-api-access-trm64\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.639292 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trm64\" (UniqueName: \"kubernetes.io/projected/8369b050-4b0f-461d-8629-9425f1997ee5-kube-api-access-trm64\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.639395 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8369b050-4b0f-461d-8629-9425f1997ee5-utilities\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.639435 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8369b050-4b0f-461d-8629-9425f1997ee5-catalog-content\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.640056 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8369b050-4b0f-461d-8629-9425f1997ee5-catalog-content\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.640161 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8369b050-4b0f-461d-8629-9425f1997ee5-utilities\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.665712 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trm64\" (UniqueName: \"kubernetes.io/projected/8369b050-4b0f-461d-8629-9425f1997ee5-kube-api-access-trm64\") pod \"certified-operators-pbw2h\" (UID: \"8369b050-4b0f-461d-8629-9425f1997ee5\") " pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:03 crc kubenswrapper[5047]: I0223 07:20:03.740675 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:04 crc kubenswrapper[5047]: I0223 07:20:04.280626 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pbw2h"]
Feb 23 07:20:04 crc kubenswrapper[5047]: I0223 07:20:04.907012 5047 generic.go:334] "Generic (PLEG): container finished" podID="8369b050-4b0f-461d-8629-9425f1997ee5" containerID="7629217cf27102e82b699222cd453f57c814930c072343ec8a0681fa04da834a" exitCode=0
Feb 23 07:20:04 crc kubenswrapper[5047]: I0223 07:20:04.907092 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbw2h" event={"ID":"8369b050-4b0f-461d-8629-9425f1997ee5","Type":"ContainerDied","Data":"7629217cf27102e82b699222cd453f57c814930c072343ec8a0681fa04da834a"}
Feb 23 07:20:04 crc kubenswrapper[5047]: I0223 07:20:04.907139 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbw2h" event={"ID":"8369b050-4b0f-461d-8629-9425f1997ee5","Type":"ContainerStarted","Data":"04589be7af32832da09e76ec0416fbc27dac3c08727a258c76bda710c854e5aa"}
Feb 23 07:20:04 crc kubenswrapper[5047]: I0223 07:20:04.909561 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 07:20:09 crc kubenswrapper[5047]: I0223 07:20:09.966651 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbw2h" event={"ID":"8369b050-4b0f-461d-8629-9425f1997ee5","Type":"ContainerStarted","Data":"74115537d48d8b46dc4a8538699620ff025eedb88a1be410eafffe6fa80378f1"}
Feb 23 07:20:10 crc kubenswrapper[5047]: I0223 07:20:10.979498 5047 generic.go:334] "Generic (PLEG): container finished" podID="8369b050-4b0f-461d-8629-9425f1997ee5" containerID="74115537d48d8b46dc4a8538699620ff025eedb88a1be410eafffe6fa80378f1" exitCode=0
Feb 23 07:20:10 crc kubenswrapper[5047]: I0223 07:20:10.979613 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbw2h" event={"ID":"8369b050-4b0f-461d-8629-9425f1997ee5","Type":"ContainerDied","Data":"74115537d48d8b46dc4a8538699620ff025eedb88a1be410eafffe6fa80378f1"}
Feb 23 07:20:11 crc kubenswrapper[5047]: I0223 07:20:11.988452 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbw2h" event={"ID":"8369b050-4b0f-461d-8629-9425f1997ee5","Type":"ContainerStarted","Data":"34e806794cee0a4d59791eb0e2d18bc3781ccecee2fe710ac14159571f2ba8b6"}
Feb 23 07:20:12 crc kubenswrapper[5047]: I0223 07:20:12.017437 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pbw2h" podStartSLOduration=2.552779095 podStartE2EDuration="9.017407206s" podCreationTimestamp="2026-02-23 07:20:03 +0000 UTC" firstStartedPulling="2026-02-23 07:20:04.909264465 +0000 UTC m=+2127.160591599" lastFinishedPulling="2026-02-23 07:20:11.373892576 +0000 UTC m=+2133.625219710" observedRunningTime="2026-02-23 07:20:12.009533002 +0000 UTC m=+2134.260860136" watchObservedRunningTime="2026-02-23 07:20:12.017407206 +0000 UTC m=+2134.268734350"
Feb 23 07:20:13 crc kubenswrapper[5047]: I0223 07:20:13.741810 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:13 crc kubenswrapper[5047]: I0223 07:20:13.742247 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:13 crc kubenswrapper[5047]: I0223 07:20:13.799305 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:23 crc kubenswrapper[5047]: I0223 07:20:23.818572 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pbw2h"
Feb 23 07:20:23 crc kubenswrapper[5047]: I0223 07:20:23.917868 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pbw2h"]
Feb 23 07:20:23 crc kubenswrapper[5047]: I0223 07:20:23.958960 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xp9kv"]
Feb 23 07:20:23 crc kubenswrapper[5047]: I0223 07:20:23.959367 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xp9kv" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="registry-server" containerID="cri-o://2f784c1289001583da65857f7f2824fd08fe5bd021b2f96edbc05b9e3910702d" gracePeriod=2
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.106269 5047 generic.go:334] "Generic (PLEG): container finished" podID="e6cf7a53-d471-4e95-b648-967929583e12" containerID="2f784c1289001583da65857f7f2824fd08fe5bd021b2f96edbc05b9e3910702d" exitCode=0
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.106575 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp9kv" event={"ID":"e6cf7a53-d471-4e95-b648-967929583e12","Type":"ContainerDied","Data":"2f784c1289001583da65857f7f2824fd08fe5bd021b2f96edbc05b9e3910702d"}
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.427639 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xp9kv"
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.517063 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk68p\" (UniqueName: \"kubernetes.io/projected/e6cf7a53-d471-4e95-b648-967929583e12-kube-api-access-xk68p\") pod \"e6cf7a53-d471-4e95-b648-967929583e12\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") "
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.517190 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-utilities\") pod \"e6cf7a53-d471-4e95-b648-967929583e12\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") "
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.517255 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-catalog-content\") pod \"e6cf7a53-d471-4e95-b648-967929583e12\" (UID: \"e6cf7a53-d471-4e95-b648-967929583e12\") "
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.520662 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-utilities" (OuterVolumeSpecName: "utilities") pod "e6cf7a53-d471-4e95-b648-967929583e12" (UID: "e6cf7a53-d471-4e95-b648-967929583e12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.539685 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cf7a53-d471-4e95-b648-967929583e12-kube-api-access-xk68p" (OuterVolumeSpecName: "kube-api-access-xk68p") pod "e6cf7a53-d471-4e95-b648-967929583e12" (UID: "e6cf7a53-d471-4e95-b648-967929583e12"). InnerVolumeSpecName "kube-api-access-xk68p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.573556 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6cf7a53-d471-4e95-b648-967929583e12" (UID: "e6cf7a53-d471-4e95-b648-967929583e12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.619618 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.619655 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cf7a53-d471-4e95-b648-967929583e12-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:20:24 crc kubenswrapper[5047]: I0223 07:20:24.619669 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk68p\" (UniqueName: \"kubernetes.io/projected/e6cf7a53-d471-4e95-b648-967929583e12-kube-api-access-xk68p\") on node \"crc\" DevicePath \"\""
Feb 23 07:20:25 crc kubenswrapper[5047]: I0223 07:20:25.121648 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp9kv" event={"ID":"e6cf7a53-d471-4e95-b648-967929583e12","Type":"ContainerDied","Data":"eb111bc5dcc2755dbf1e65ccda0bb162b4527b02f921e332f9f7c3f326e69425"}
Feb 23 07:20:25 crc kubenswrapper[5047]: I0223 07:20:25.122041 5047 scope.go:117] "RemoveContainer" containerID="2f784c1289001583da65857f7f2824fd08fe5bd021b2f96edbc05b9e3910702d"
Feb 23 07:20:25 crc kubenswrapper[5047]: I0223 07:20:25.122279 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xp9kv"
Feb 23 07:20:25 crc kubenswrapper[5047]: I0223 07:20:25.149347 5047 scope.go:117] "RemoveContainer" containerID="971dcb84903d6a4031431cc42275b1125ceeec48b6502d13e4458f2981523f7e"
Feb 23 07:20:25 crc kubenswrapper[5047]: I0223 07:20:25.166296 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xp9kv"]
Feb 23 07:20:25 crc kubenswrapper[5047]: I0223 07:20:25.179836 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xp9kv"]
Feb 23 07:20:25 crc kubenswrapper[5047]: I0223 07:20:25.185841 5047 scope.go:117] "RemoveContainer" containerID="2fc1fe6c48327f950dde8169a3751effa057a86b57ea5171620f75cd43ab3525"
Feb 23 07:20:26 crc kubenswrapper[5047]: I0223 07:20:26.357542 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cf7a53-d471-4e95-b648-967929583e12" path="/var/lib/kubelet/pods/e6cf7a53-d471-4e95-b648-967929583e12/volumes"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.864782 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qnmfc"]
Feb 23 07:20:32 crc kubenswrapper[5047]: E0223 07:20:32.866050 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="registry-server"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.866078 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="registry-server"
Feb 23 07:20:32 crc kubenswrapper[5047]: E0223 07:20:32.866114 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="extract-utilities"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.866127 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="extract-utilities"
Feb 23 07:20:32 crc kubenswrapper[5047]: E0223 07:20:32.866157 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="extract-content"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.866172 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="extract-content"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.866473 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cf7a53-d471-4e95-b648-967929583e12" containerName="registry-server"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.870677 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.944810 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnmfc"]
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.967091 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-catalog-content\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.967158 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-utilities\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:32 crc kubenswrapper[5047]: I0223 07:20:32.967216 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dnxg\" (UniqueName: \"kubernetes.io/projected/8bae2e2d-3734-420b-9c02-90980eac8859-kube-api-access-7dnxg\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.068491 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dnxg\" (UniqueName: \"kubernetes.io/projected/8bae2e2d-3734-420b-9c02-90980eac8859-kube-api-access-7dnxg\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.068633 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-catalog-content\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.068673 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-utilities\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.069388 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-utilities\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.069569 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-catalog-content\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.102283 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dnxg\" (UniqueName: \"kubernetes.io/projected/8bae2e2d-3734-420b-9c02-90980eac8859-kube-api-access-7dnxg\") pod \"redhat-operators-qnmfc\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.193261 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmfc"
Feb 23 07:20:33 crc kubenswrapper[5047]: I0223 07:20:33.674572 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnmfc"]
Feb 23 07:20:34 crc kubenswrapper[5047]: I0223 07:20:34.218762 5047 generic.go:334] "Generic (PLEG): container finished" podID="8bae2e2d-3734-420b-9c02-90980eac8859" containerID="702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32" exitCode=0
Feb 23 07:20:34 crc kubenswrapper[5047]: I0223 07:20:34.218869 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmfc" event={"ID":"8bae2e2d-3734-420b-9c02-90980eac8859","Type":"ContainerDied","Data":"702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32"}
Feb 23 07:20:34 crc kubenswrapper[5047]: I0223 07:20:34.219227 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmfc" event={"ID":"8bae2e2d-3734-420b-9c02-90980eac8859","Type":"ContainerStarted","Data":"b49cff3eacb6f4ee2ce77188a2186af0b152fa3a83d9e3b21dc4563f84b189a6"}
Feb 23 07:20:36 crc kubenswrapper[5047]: I0223 07:20:36.240566 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="8bae2e2d-3734-420b-9c02-90980eac8859" containerID="74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a" exitCode=0 Feb 23 07:20:36 crc kubenswrapper[5047]: I0223 07:20:36.240865 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmfc" event={"ID":"8bae2e2d-3734-420b-9c02-90980eac8859","Type":"ContainerDied","Data":"74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a"} Feb 23 07:20:37 crc kubenswrapper[5047]: I0223 07:20:37.251342 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmfc" event={"ID":"8bae2e2d-3734-420b-9c02-90980eac8859","Type":"ContainerStarted","Data":"aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2"} Feb 23 07:20:37 crc kubenswrapper[5047]: I0223 07:20:37.279384 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qnmfc" podStartSLOduration=2.84568762 podStartE2EDuration="5.279362966s" podCreationTimestamp="2026-02-23 07:20:32 +0000 UTC" firstStartedPulling="2026-02-23 07:20:34.221087114 +0000 UTC m=+2156.472414248" lastFinishedPulling="2026-02-23 07:20:36.65476246 +0000 UTC m=+2158.906089594" observedRunningTime="2026-02-23 07:20:37.276751725 +0000 UTC m=+2159.528078919" watchObservedRunningTime="2026-02-23 07:20:37.279362966 +0000 UTC m=+2159.530690120" Feb 23 07:20:43 crc kubenswrapper[5047]: I0223 07:20:43.194134 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qnmfc" Feb 23 07:20:43 crc kubenswrapper[5047]: I0223 07:20:43.196419 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qnmfc" Feb 23 07:20:44 crc kubenswrapper[5047]: I0223 07:20:44.265620 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qnmfc" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" 
containerName="registry-server" probeResult="failure" output=< Feb 23 07:20:44 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 07:20:44 crc kubenswrapper[5047]: > Feb 23 07:20:53 crc kubenswrapper[5047]: I0223 07:20:53.267850 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qnmfc" Feb 23 07:20:53 crc kubenswrapper[5047]: I0223 07:20:53.335440 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qnmfc" Feb 23 07:20:53 crc kubenswrapper[5047]: I0223 07:20:53.522784 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnmfc"] Feb 23 07:20:54 crc kubenswrapper[5047]: I0223 07:20:54.598421 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qnmfc" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="registry-server" containerID="cri-o://aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2" gracePeriod=2 Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.079696 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmfc" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.208339 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-catalog-content\") pod \"8bae2e2d-3734-420b-9c02-90980eac8859\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.208596 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-utilities\") pod \"8bae2e2d-3734-420b-9c02-90980eac8859\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.208703 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dnxg\" (UniqueName: \"kubernetes.io/projected/8bae2e2d-3734-420b-9c02-90980eac8859-kube-api-access-7dnxg\") pod \"8bae2e2d-3734-420b-9c02-90980eac8859\" (UID: \"8bae2e2d-3734-420b-9c02-90980eac8859\") " Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.210127 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-utilities" (OuterVolumeSpecName: "utilities") pod "8bae2e2d-3734-420b-9c02-90980eac8859" (UID: "8bae2e2d-3734-420b-9c02-90980eac8859"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.216060 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bae2e2d-3734-420b-9c02-90980eac8859-kube-api-access-7dnxg" (OuterVolumeSpecName: "kube-api-access-7dnxg") pod "8bae2e2d-3734-420b-9c02-90980eac8859" (UID: "8bae2e2d-3734-420b-9c02-90980eac8859"). InnerVolumeSpecName "kube-api-access-7dnxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.311505 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.311583 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dnxg\" (UniqueName: \"kubernetes.io/projected/8bae2e2d-3734-420b-9c02-90980eac8859-kube-api-access-7dnxg\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.380158 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bae2e2d-3734-420b-9c02-90980eac8859" (UID: "8bae2e2d-3734-420b-9c02-90980eac8859"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.414408 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bae2e2d-3734-420b-9c02-90980eac8859-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.613053 5047 generic.go:334] "Generic (PLEG): container finished" podID="8bae2e2d-3734-420b-9c02-90980eac8859" containerID="aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2" exitCode=0 Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.613180 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnmfc" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.613160 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmfc" event={"ID":"8bae2e2d-3734-420b-9c02-90980eac8859","Type":"ContainerDied","Data":"aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2"} Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.613425 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnmfc" event={"ID":"8bae2e2d-3734-420b-9c02-90980eac8859","Type":"ContainerDied","Data":"b49cff3eacb6f4ee2ce77188a2186af0b152fa3a83d9e3b21dc4563f84b189a6"} Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.613461 5047 scope.go:117] "RemoveContainer" containerID="aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.665014 5047 scope.go:117] "RemoveContainer" containerID="74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.674159 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnmfc"] Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.687987 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qnmfc"] Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.692642 5047 scope.go:117] "RemoveContainer" containerID="702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.730152 5047 scope.go:117] "RemoveContainer" containerID="aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2" Feb 23 07:20:55 crc kubenswrapper[5047]: E0223 07:20:55.730838 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2\": container with ID starting with aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2 not found: ID does not exist" containerID="aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.730932 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2"} err="failed to get container status \"aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2\": rpc error: code = NotFound desc = could not find container \"aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2\": container with ID starting with aa731eaa5e5abbee9819e50352c3ef1f1f1dcfb11c423245322d19b28f4ec2f2 not found: ID does not exist" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.730971 5047 scope.go:117] "RemoveContainer" containerID="74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a" Feb 23 07:20:55 crc kubenswrapper[5047]: E0223 07:20:55.732211 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a\": container with ID starting with 74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a not found: ID does not exist" containerID="74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.732360 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a"} err="failed to get container status \"74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a\": rpc error: code = NotFound desc = could not find container \"74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a\": container with ID 
starting with 74d3a54d5f6b20825a326de20c1513d15b64919c71d449a37fc6f6f8e568b28a not found: ID does not exist" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.732399 5047 scope.go:117] "RemoveContainer" containerID="702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32" Feb 23 07:20:55 crc kubenswrapper[5047]: E0223 07:20:55.732808 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32\": container with ID starting with 702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32 not found: ID does not exist" containerID="702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32" Feb 23 07:20:55 crc kubenswrapper[5047]: I0223 07:20:55.732833 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32"} err="failed to get container status \"702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32\": rpc error: code = NotFound desc = could not find container \"702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32\": container with ID starting with 702baf6936349ce01e46a157f2a821f5703a663354572d4531a530e3e11cab32 not found: ID does not exist" Feb 23 07:20:56 crc kubenswrapper[5047]: I0223 07:20:56.352319 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" path="/var/lib/kubelet/pods/8bae2e2d-3734-420b-9c02-90980eac8859/volumes" Feb 23 07:21:16 crc kubenswrapper[5047]: I0223 07:21:16.759977 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:21:16 crc kubenswrapper[5047]: I0223 
07:21:16.761123 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:21:46 crc kubenswrapper[5047]: I0223 07:21:46.760164 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:21:46 crc kubenswrapper[5047]: I0223 07:21:46.761040 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:22:16 crc kubenswrapper[5047]: I0223 07:22:16.759766 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:22:16 crc kubenswrapper[5047]: I0223 07:22:16.762151 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:22:16 crc kubenswrapper[5047]: I0223 07:22:16.762262 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:22:16 crc kubenswrapper[5047]: I0223 07:22:16.764159 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d13b86f3c29a55a8034ca0dd7a08a0b8d33896a1a4f64bdde470e123ac2b1e9"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:22:16 crc kubenswrapper[5047]: I0223 07:22:16.764266 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://0d13b86f3c29a55a8034ca0dd7a08a0b8d33896a1a4f64bdde470e123ac2b1e9" gracePeriod=600 Feb 23 07:22:17 crc kubenswrapper[5047]: I0223 07:22:17.086782 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="0d13b86f3c29a55a8034ca0dd7a08a0b8d33896a1a4f64bdde470e123ac2b1e9" exitCode=0 Feb 23 07:22:17 crc kubenswrapper[5047]: I0223 07:22:17.087064 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"0d13b86f3c29a55a8034ca0dd7a08a0b8d33896a1a4f64bdde470e123ac2b1e9"} Feb 23 07:22:17 crc kubenswrapper[5047]: I0223 07:22:17.087356 5047 scope.go:117] "RemoveContainer" containerID="cf451a45d17468799ce9c18cb57b8bb5cc051930b8abccce96bfd029bd6287d0" Feb 23 07:22:18 crc kubenswrapper[5047]: I0223 07:22:18.101775 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f"} Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.353083 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-brwq7"] Feb 23 07:22:26 crc kubenswrapper[5047]: E0223 07:22:26.354239 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="extract-content" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.354258 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="extract-content" Feb 23 07:22:26 crc kubenswrapper[5047]: E0223 07:22:26.354303 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="registry-server" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.354312 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="registry-server" Feb 23 07:22:26 crc kubenswrapper[5047]: E0223 07:22:26.354320 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="extract-utilities" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.354329 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="extract-utilities" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.354520 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bae2e2d-3734-420b-9c02-90980eac8859" containerName="registry-server" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.356508 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.374277 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwq7"] Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.509947 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-utilities\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.510053 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-catalog-content\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.510108 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kswmr\" (UniqueName: \"kubernetes.io/projected/920cb13a-017f-400b-8b51-e341395ba40c-kube-api-access-kswmr\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.611772 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kswmr\" (UniqueName: \"kubernetes.io/projected/920cb13a-017f-400b-8b51-e341395ba40c-kube-api-access-kswmr\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.611848 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-utilities\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.611895 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-catalog-content\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.612462 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-catalog-content\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.612655 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-utilities\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.648012 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kswmr\" (UniqueName: \"kubernetes.io/projected/920cb13a-017f-400b-8b51-e341395ba40c-kube-api-access-kswmr\") pod \"redhat-marketplace-brwq7\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:26 crc kubenswrapper[5047]: I0223 07:22:26.683987 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:27 crc kubenswrapper[5047]: I0223 07:22:27.156372 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwq7"] Feb 23 07:22:27 crc kubenswrapper[5047]: I0223 07:22:27.218174 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwq7" event={"ID":"920cb13a-017f-400b-8b51-e341395ba40c","Type":"ContainerStarted","Data":"7daf2455830a94a40d216c3723569c19c99fd064302f5c6e7c122738425a5e21"} Feb 23 07:22:28 crc kubenswrapper[5047]: I0223 07:22:28.230678 5047 generic.go:334] "Generic (PLEG): container finished" podID="920cb13a-017f-400b-8b51-e341395ba40c" containerID="391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557" exitCode=0 Feb 23 07:22:28 crc kubenswrapper[5047]: I0223 07:22:28.230775 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwq7" event={"ID":"920cb13a-017f-400b-8b51-e341395ba40c","Type":"ContainerDied","Data":"391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557"} Feb 23 07:22:30 crc kubenswrapper[5047]: I0223 07:22:30.259840 5047 generic.go:334] "Generic (PLEG): container finished" podID="920cb13a-017f-400b-8b51-e341395ba40c" containerID="92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21" exitCode=0 Feb 23 07:22:30 crc kubenswrapper[5047]: I0223 07:22:30.260105 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwq7" event={"ID":"920cb13a-017f-400b-8b51-e341395ba40c","Type":"ContainerDied","Data":"92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21"} Feb 23 07:22:31 crc kubenswrapper[5047]: I0223 07:22:31.270161 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwq7" 
event={"ID":"920cb13a-017f-400b-8b51-e341395ba40c","Type":"ContainerStarted","Data":"b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3"} Feb 23 07:22:31 crc kubenswrapper[5047]: I0223 07:22:31.308188 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-brwq7" podStartSLOduration=2.75739947 podStartE2EDuration="5.308157826s" podCreationTimestamp="2026-02-23 07:22:26 +0000 UTC" firstStartedPulling="2026-02-23 07:22:28.233271604 +0000 UTC m=+2270.484598778" lastFinishedPulling="2026-02-23 07:22:30.78403 +0000 UTC m=+2273.035357134" observedRunningTime="2026-02-23 07:22:31.295351468 +0000 UTC m=+2273.546678642" watchObservedRunningTime="2026-02-23 07:22:31.308157826 +0000 UTC m=+2273.559484970" Feb 23 07:22:36 crc kubenswrapper[5047]: I0223 07:22:36.684863 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:36 crc kubenswrapper[5047]: I0223 07:22:36.685494 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:36 crc kubenswrapper[5047]: I0223 07:22:36.774293 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:37 crc kubenswrapper[5047]: I0223 07:22:37.403395 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:37 crc kubenswrapper[5047]: I0223 07:22:37.483753 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwq7"] Feb 23 07:22:39 crc kubenswrapper[5047]: I0223 07:22:39.342764 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-brwq7" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="registry-server" 
containerID="cri-o://b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3" gracePeriod=2 Feb 23 07:22:39 crc kubenswrapper[5047]: I0223 07:22:39.804465 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:39 crc kubenswrapper[5047]: I0223 07:22:39.982804 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-utilities\") pod \"920cb13a-017f-400b-8b51-e341395ba40c\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " Feb 23 07:22:39 crc kubenswrapper[5047]: I0223 07:22:39.983148 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kswmr\" (UniqueName: \"kubernetes.io/projected/920cb13a-017f-400b-8b51-e341395ba40c-kube-api-access-kswmr\") pod \"920cb13a-017f-400b-8b51-e341395ba40c\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " Feb 23 07:22:39 crc kubenswrapper[5047]: I0223 07:22:39.983241 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-catalog-content\") pod \"920cb13a-017f-400b-8b51-e341395ba40c\" (UID: \"920cb13a-017f-400b-8b51-e341395ba40c\") " Feb 23 07:22:39 crc kubenswrapper[5047]: I0223 07:22:39.984307 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-utilities" (OuterVolumeSpecName: "utilities") pod "920cb13a-017f-400b-8b51-e341395ba40c" (UID: "920cb13a-017f-400b-8b51-e341395ba40c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:22:39 crc kubenswrapper[5047]: I0223 07:22:39.997272 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920cb13a-017f-400b-8b51-e341395ba40c-kube-api-access-kswmr" (OuterVolumeSpecName: "kube-api-access-kswmr") pod "920cb13a-017f-400b-8b51-e341395ba40c" (UID: "920cb13a-017f-400b-8b51-e341395ba40c"). InnerVolumeSpecName "kube-api-access-kswmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.019311 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "920cb13a-017f-400b-8b51-e341395ba40c" (UID: "920cb13a-017f-400b-8b51-e341395ba40c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.084998 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.085041 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/920cb13a-017f-400b-8b51-e341395ba40c-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.085056 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kswmr\" (UniqueName: \"kubernetes.io/projected/920cb13a-017f-400b-8b51-e341395ba40c-kube-api-access-kswmr\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.370926 5047 generic.go:334] "Generic (PLEG): container finished" podID="920cb13a-017f-400b-8b51-e341395ba40c" 
containerID="b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3" exitCode=0 Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.371000 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwq7" event={"ID":"920cb13a-017f-400b-8b51-e341395ba40c","Type":"ContainerDied","Data":"b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3"} Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.371042 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-brwq7" event={"ID":"920cb13a-017f-400b-8b51-e341395ba40c","Type":"ContainerDied","Data":"7daf2455830a94a40d216c3723569c19c99fd064302f5c6e7c122738425a5e21"} Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.371072 5047 scope.go:117] "RemoveContainer" containerID="b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.371290 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-brwq7" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.414766 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwq7"] Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.416217 5047 scope.go:117] "RemoveContainer" containerID="92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.422578 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-brwq7"] Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.437545 5047 scope.go:117] "RemoveContainer" containerID="391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.484896 5047 scope.go:117] "RemoveContainer" containerID="b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3" Feb 23 07:22:40 crc kubenswrapper[5047]: E0223 07:22:40.485850 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3\": container with ID starting with b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3 not found: ID does not exist" containerID="b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.485945 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3"} err="failed to get container status \"b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3\": rpc error: code = NotFound desc = could not find container \"b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3\": container with ID starting with b3d44636adcac696402e8f118ad8f7e55f8ea536313978f1da59615500ce63e3 not found: 
ID does not exist" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.485994 5047 scope.go:117] "RemoveContainer" containerID="92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21" Feb 23 07:22:40 crc kubenswrapper[5047]: E0223 07:22:40.486688 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21\": container with ID starting with 92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21 not found: ID does not exist" containerID="92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.486762 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21"} err="failed to get container status \"92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21\": rpc error: code = NotFound desc = could not find container \"92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21\": container with ID starting with 92b9e268871226462c6193b7739a222c52d7ac865022615353b20c2d91565f21 not found: ID does not exist" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.486806 5047 scope.go:117] "RemoveContainer" containerID="391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557" Feb 23 07:22:40 crc kubenswrapper[5047]: E0223 07:22:40.487485 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557\": container with ID starting with 391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557 not found: ID does not exist" containerID="391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557" Feb 23 07:22:40 crc kubenswrapper[5047]: I0223 07:22:40.487534 5047 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557"} err="failed to get container status \"391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557\": rpc error: code = NotFound desc = could not find container \"391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557\": container with ID starting with 391d4edcc5ef29d0c6e534df06892d9f6dd8c9549e19185f4464a9db33f30557 not found: ID does not exist" Feb 23 07:22:42 crc kubenswrapper[5047]: I0223 07:22:42.353114 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920cb13a-017f-400b-8b51-e341395ba40c" path="/var/lib/kubelet/pods/920cb13a-017f-400b-8b51-e341395ba40c/volumes" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.530825 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rqzt5"] Feb 23 07:22:45 crc kubenswrapper[5047]: E0223 07:22:45.531859 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="registry-server" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.531879 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="registry-server" Feb 23 07:22:45 crc kubenswrapper[5047]: E0223 07:22:45.531932 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="extract-utilities" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.531942 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="extract-utilities" Feb 23 07:22:45 crc kubenswrapper[5047]: E0223 07:22:45.531962 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="extract-content" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.531971 5047 
state_mem.go:107] "Deleted CPUSet assignment" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="extract-content" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.532249 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="920cb13a-017f-400b-8b51-e341395ba40c" containerName="registry-server" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.533973 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.552170 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqzt5"] Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.686579 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-catalog-content\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.686646 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-utilities\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.686689 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86c6x\" (UniqueName: \"kubernetes.io/projected/e921773b-a96e-4928-a8a6-2c32ad6932bc-kube-api-access-86c6x\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 
07:22:45.788078 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-catalog-content\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.788150 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-utilities\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.788200 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86c6x\" (UniqueName: \"kubernetes.io/projected/e921773b-a96e-4928-a8a6-2c32ad6932bc-kube-api-access-86c6x\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.789100 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-utilities\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.789118 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-catalog-content\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.810269 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86c6x\" (UniqueName: \"kubernetes.io/projected/e921773b-a96e-4928-a8a6-2c32ad6932bc-kube-api-access-86c6x\") pod \"community-operators-rqzt5\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:45 crc kubenswrapper[5047]: I0223 07:22:45.864959 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:46 crc kubenswrapper[5047]: I0223 07:22:46.454803 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqzt5"] Feb 23 07:22:47 crc kubenswrapper[5047]: I0223 07:22:47.442863 5047 generic.go:334] "Generic (PLEG): container finished" podID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerID="03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0" exitCode=0 Feb 23 07:22:47 crc kubenswrapper[5047]: I0223 07:22:47.442967 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqzt5" event={"ID":"e921773b-a96e-4928-a8a6-2c32ad6932bc","Type":"ContainerDied","Data":"03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0"} Feb 23 07:22:47 crc kubenswrapper[5047]: I0223 07:22:47.443059 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqzt5" event={"ID":"e921773b-a96e-4928-a8a6-2c32ad6932bc","Type":"ContainerStarted","Data":"e1cdd3942990f505074563393f272ccbaf69c8ab8c500c5c5856170392d30866"} Feb 23 07:22:48 crc kubenswrapper[5047]: I0223 07:22:48.451922 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqzt5" event={"ID":"e921773b-a96e-4928-a8a6-2c32ad6932bc","Type":"ContainerStarted","Data":"42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425"} Feb 23 07:22:49 crc kubenswrapper[5047]: I0223 07:22:49.464313 5047 
generic.go:334] "Generic (PLEG): container finished" podID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerID="42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425" exitCode=0 Feb 23 07:22:49 crc kubenswrapper[5047]: I0223 07:22:49.464404 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqzt5" event={"ID":"e921773b-a96e-4928-a8a6-2c32ad6932bc","Type":"ContainerDied","Data":"42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425"} Feb 23 07:22:51 crc kubenswrapper[5047]: I0223 07:22:51.484535 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqzt5" event={"ID":"e921773b-a96e-4928-a8a6-2c32ad6932bc","Type":"ContainerStarted","Data":"c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1"} Feb 23 07:22:51 crc kubenswrapper[5047]: I0223 07:22:51.511689 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rqzt5" podStartSLOduration=4.088242771 podStartE2EDuration="6.51166981s" podCreationTimestamp="2026-02-23 07:22:45 +0000 UTC" firstStartedPulling="2026-02-23 07:22:47.446591231 +0000 UTC m=+2289.697918385" lastFinishedPulling="2026-02-23 07:22:49.87001824 +0000 UTC m=+2292.121345424" observedRunningTime="2026-02-23 07:22:51.504170326 +0000 UTC m=+2293.755497460" watchObservedRunningTime="2026-02-23 07:22:51.51166981 +0000 UTC m=+2293.762996944" Feb 23 07:22:55 crc kubenswrapper[5047]: I0223 07:22:55.866209 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:55 crc kubenswrapper[5047]: I0223 07:22:55.866871 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:55 crc kubenswrapper[5047]: I0223 07:22:55.944015 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:56 crc kubenswrapper[5047]: I0223 07:22:56.599021 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:56 crc kubenswrapper[5047]: I0223 07:22:56.673025 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqzt5"] Feb 23 07:22:58 crc kubenswrapper[5047]: I0223 07:22:58.550950 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rqzt5" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="registry-server" containerID="cri-o://c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1" gracePeriod=2 Feb 23 07:22:58 crc kubenswrapper[5047]: I0223 07:22:58.996012 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.127648 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86c6x\" (UniqueName: \"kubernetes.io/projected/e921773b-a96e-4928-a8a6-2c32ad6932bc-kube-api-access-86c6x\") pod \"e921773b-a96e-4928-a8a6-2c32ad6932bc\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.127779 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-catalog-content\") pod \"e921773b-a96e-4928-a8a6-2c32ad6932bc\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.128008 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-utilities\") pod 
\"e921773b-a96e-4928-a8a6-2c32ad6932bc\" (UID: \"e921773b-a96e-4928-a8a6-2c32ad6932bc\") " Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.128848 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-utilities" (OuterVolumeSpecName: "utilities") pod "e921773b-a96e-4928-a8a6-2c32ad6932bc" (UID: "e921773b-a96e-4928-a8a6-2c32ad6932bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.136333 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e921773b-a96e-4928-a8a6-2c32ad6932bc-kube-api-access-86c6x" (OuterVolumeSpecName: "kube-api-access-86c6x") pod "e921773b-a96e-4928-a8a6-2c32ad6932bc" (UID: "e921773b-a96e-4928-a8a6-2c32ad6932bc"). InnerVolumeSpecName "kube-api-access-86c6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.180503 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e921773b-a96e-4928-a8a6-2c32ad6932bc" (UID: "e921773b-a96e-4928-a8a6-2c32ad6932bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.229721 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.229765 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86c6x\" (UniqueName: \"kubernetes.io/projected/e921773b-a96e-4928-a8a6-2c32ad6932bc-kube-api-access-86c6x\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.229779 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e921773b-a96e-4928-a8a6-2c32ad6932bc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.572849 5047 generic.go:334] "Generic (PLEG): container finished" podID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerID="c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1" exitCode=0 Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.572966 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqzt5" event={"ID":"e921773b-a96e-4928-a8a6-2c32ad6932bc","Type":"ContainerDied","Data":"c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1"} Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.573019 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqzt5" event={"ID":"e921773b-a96e-4928-a8a6-2c32ad6932bc","Type":"ContainerDied","Data":"e1cdd3942990f505074563393f272ccbaf69c8ab8c500c5c5856170392d30866"} Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.573051 5047 scope.go:117] "RemoveContainer" containerID="c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 
07:22:59.573103 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqzt5" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.616344 5047 scope.go:117] "RemoveContainer" containerID="42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.632327 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqzt5"] Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.647788 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rqzt5"] Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.649312 5047 scope.go:117] "RemoveContainer" containerID="03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.686363 5047 scope.go:117] "RemoveContainer" containerID="c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1" Feb 23 07:22:59 crc kubenswrapper[5047]: E0223 07:22:59.687643 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1\": container with ID starting with c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1 not found: ID does not exist" containerID="c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.687713 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1"} err="failed to get container status \"c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1\": rpc error: code = NotFound desc = could not find container \"c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1\": container with ID starting with 
c472a72d3c3382cbdc27dd9cb4acb9b0a48834928991ad6f9c60a1c60bb77fc1 not found: ID does not exist" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.687754 5047 scope.go:117] "RemoveContainer" containerID="42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425" Feb 23 07:22:59 crc kubenswrapper[5047]: E0223 07:22:59.688165 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425\": container with ID starting with 42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425 not found: ID does not exist" containerID="42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.688201 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425"} err="failed to get container status \"42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425\": rpc error: code = NotFound desc = could not find container \"42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425\": container with ID starting with 42969688a00bd04106be1509ff51e1511f014ea787479571cfe37b0244cd4425 not found: ID does not exist" Feb 23 07:22:59 crc kubenswrapper[5047]: I0223 07:22:59.688221 5047 scope.go:117] "RemoveContainer" containerID="03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0" Feb 23 07:22:59 crc kubenswrapper[5047]: E0223 07:22:59.688505 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0\": container with ID starting with 03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0 not found: ID does not exist" containerID="03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0" Feb 23 07:22:59 crc 
kubenswrapper[5047]: I0223 07:22:59.688537 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0"} err="failed to get container status \"03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0\": rpc error: code = NotFound desc = could not find container \"03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0\": container with ID starting with 03daec562fa4ebe90ac40fb0b50e188e63464c6f90c8250e4e0e27442ab6b9e0 not found: ID does not exist" Feb 23 07:23:00 crc kubenswrapper[5047]: I0223 07:23:00.359991 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" path="/var/lib/kubelet/pods/e921773b-a96e-4928-a8a6-2c32ad6932bc/volumes" Feb 23 07:24:46 crc kubenswrapper[5047]: I0223 07:24:46.759386 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:24:46 crc kubenswrapper[5047]: I0223 07:24:46.759934 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:25:16 crc kubenswrapper[5047]: I0223 07:25:16.759799 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:25:16 crc kubenswrapper[5047]: I0223 07:25:16.760742 5047 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:25:46 crc kubenswrapper[5047]: I0223 07:25:46.760101 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:25:46 crc kubenswrapper[5047]: I0223 07:25:46.761078 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:25:46 crc kubenswrapper[5047]: I0223 07:25:46.761145 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:25:47 crc kubenswrapper[5047]: I0223 07:25:47.276586 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:25:47 crc kubenswrapper[5047]: I0223 07:25:47.276743 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" 
containerName="machine-config-daemon" containerID="cri-o://3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" gracePeriod=600 Feb 23 07:25:47 crc kubenswrapper[5047]: E0223 07:25:47.402514 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:25:48 crc kubenswrapper[5047]: I0223 07:25:48.291426 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" exitCode=0 Feb 23 07:25:48 crc kubenswrapper[5047]: I0223 07:25:48.291495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f"} Feb 23 07:25:48 crc kubenswrapper[5047]: I0223 07:25:48.291546 5047 scope.go:117] "RemoveContainer" containerID="0d13b86f3c29a55a8034ca0dd7a08a0b8d33896a1a4f64bdde470e123ac2b1e9" Feb 23 07:25:48 crc kubenswrapper[5047]: I0223 07:25:48.298064 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:25:48 crc kubenswrapper[5047]: E0223 07:25:48.298612 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:26:02 crc kubenswrapper[5047]: I0223 07:26:02.340847 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:26:02 crc kubenswrapper[5047]: E0223 07:26:02.341858 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:26:13 crc kubenswrapper[5047]: I0223 07:26:13.341595 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:26:13 crc kubenswrapper[5047]: E0223 07:26:13.344125 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:26:27 crc kubenswrapper[5047]: I0223 07:26:27.342581 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:26:27 crc kubenswrapper[5047]: E0223 07:26:27.344313 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:26:39 crc kubenswrapper[5047]: I0223 07:26:39.342144 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:26:39 crc kubenswrapper[5047]: E0223 07:26:39.343478 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:26:54 crc kubenswrapper[5047]: I0223 07:26:54.344059 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:26:54 crc kubenswrapper[5047]: E0223 07:26:54.345664 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:27:07 crc kubenswrapper[5047]: I0223 07:27:07.341852 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:27:07 crc kubenswrapper[5047]: E0223 07:27:07.343056 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:27:18 crc kubenswrapper[5047]: I0223 07:27:18.350791 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:27:18 crc kubenswrapper[5047]: E0223 07:27:18.352098 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:27:31 crc kubenswrapper[5047]: I0223 07:27:31.341098 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:27:31 crc kubenswrapper[5047]: E0223 07:27:31.341817 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:27:42 crc kubenswrapper[5047]: I0223 07:27:42.341361 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:27:42 crc kubenswrapper[5047]: E0223 07:27:42.342545 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:27:57 crc kubenswrapper[5047]: I0223 07:27:57.342110 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:27:57 crc kubenswrapper[5047]: E0223 07:27:57.343093 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:28:10 crc kubenswrapper[5047]: I0223 07:28:10.342386 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:28:10 crc kubenswrapper[5047]: E0223 07:28:10.343636 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:28:21 crc kubenswrapper[5047]: I0223 07:28:21.341188 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:28:21 crc kubenswrapper[5047]: E0223 07:28:21.341895 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:28:34 crc kubenswrapper[5047]: I0223 07:28:34.341142 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:28:34 crc kubenswrapper[5047]: E0223 07:28:34.342034 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:28:45 crc kubenswrapper[5047]: I0223 07:28:45.340845 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:28:45 crc kubenswrapper[5047]: E0223 07:28:45.341564 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:28:58 crc kubenswrapper[5047]: I0223 07:28:58.352118 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:28:58 crc kubenswrapper[5047]: E0223 07:28:58.353058 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:29:13 crc kubenswrapper[5047]: I0223 07:29:13.341573 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:29:13 crc kubenswrapper[5047]: E0223 07:29:13.342793 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:29:24 crc kubenswrapper[5047]: I0223 07:29:24.342196 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:29:24 crc kubenswrapper[5047]: E0223 07:29:24.343250 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:29:35 crc kubenswrapper[5047]: I0223 07:29:35.341279 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:29:35 crc kubenswrapper[5047]: E0223 07:29:35.344397 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:29:46 crc kubenswrapper[5047]: I0223 07:29:46.341675 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:29:46 crc kubenswrapper[5047]: E0223 07:29:46.342827 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:29:57 crc kubenswrapper[5047]: I0223 07:29:57.341544 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:29:57 crc kubenswrapper[5047]: E0223 07:29:57.342786 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.162381 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc"] Feb 23 07:30:00 crc kubenswrapper[5047]: E0223 07:30:00.162688 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="extract-content" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.162703 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="extract-content" Feb 23 07:30:00 crc kubenswrapper[5047]: E0223 07:30:00.162742 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="extract-utilities" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.162749 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="extract-utilities" Feb 23 07:30:00 crc kubenswrapper[5047]: E0223 07:30:00.162759 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="registry-server" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.162766 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="registry-server" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.162932 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e921773b-a96e-4928-a8a6-2c32ad6932bc" containerName="registry-server" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.163427 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.169811 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.169867 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.184800 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc"] Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.288053 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/843b8015-e7e8-41a1-b138-af5e0c84a030-secret-volume\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.288444 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/843b8015-e7e8-41a1-b138-af5e0c84a030-config-volume\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.288533 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rfd5\" (UniqueName: \"kubernetes.io/projected/843b8015-e7e8-41a1-b138-af5e0c84a030-kube-api-access-8rfd5\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.390145 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/843b8015-e7e8-41a1-b138-af5e0c84a030-secret-volume\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.390293 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/843b8015-e7e8-41a1-b138-af5e0c84a030-config-volume\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.390321 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rfd5\" (UniqueName: \"kubernetes.io/projected/843b8015-e7e8-41a1-b138-af5e0c84a030-kube-api-access-8rfd5\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.391579 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/843b8015-e7e8-41a1-b138-af5e0c84a030-config-volume\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.397711 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/843b8015-e7e8-41a1-b138-af5e0c84a030-secret-volume\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.424206 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rfd5\" (UniqueName: \"kubernetes.io/projected/843b8015-e7e8-41a1-b138-af5e0c84a030-kube-api-access-8rfd5\") pod \"collect-profiles-29530530-m9fjc\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:00 crc kubenswrapper[5047]: I0223 07:30:00.488857 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:01 crc kubenswrapper[5047]: I0223 07:30:01.012273 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc"] Feb 23 07:30:02 crc kubenswrapper[5047]: I0223 07:30:02.039231 5047 generic.go:334] "Generic (PLEG): container finished" podID="843b8015-e7e8-41a1-b138-af5e0c84a030" containerID="5e3175a92fc9af5cb28f81925aae27a76c1169a89bc1df5121ffbda49e05a7e6" exitCode=0 Feb 23 07:30:02 crc kubenswrapper[5047]: I0223 07:30:02.039495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" event={"ID":"843b8015-e7e8-41a1-b138-af5e0c84a030","Type":"ContainerDied","Data":"5e3175a92fc9af5cb28f81925aae27a76c1169a89bc1df5121ffbda49e05a7e6"} Feb 23 07:30:02 crc kubenswrapper[5047]: I0223 07:30:02.039578 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" 
event={"ID":"843b8015-e7e8-41a1-b138-af5e0c84a030","Type":"ContainerStarted","Data":"4b37c5fff1c1a159196035164055250ab9490a66e747b8546c4ea01d6aad046e"} Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.401393 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.449956 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/843b8015-e7e8-41a1-b138-af5e0c84a030-config-volume\") pod \"843b8015-e7e8-41a1-b138-af5e0c84a030\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.450013 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/843b8015-e7e8-41a1-b138-af5e0c84a030-secret-volume\") pod \"843b8015-e7e8-41a1-b138-af5e0c84a030\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.450626 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rfd5\" (UniqueName: \"kubernetes.io/projected/843b8015-e7e8-41a1-b138-af5e0c84a030-kube-api-access-8rfd5\") pod \"843b8015-e7e8-41a1-b138-af5e0c84a030\" (UID: \"843b8015-e7e8-41a1-b138-af5e0c84a030\") " Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.451488 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843b8015-e7e8-41a1-b138-af5e0c84a030-config-volume" (OuterVolumeSpecName: "config-volume") pod "843b8015-e7e8-41a1-b138-af5e0c84a030" (UID: "843b8015-e7e8-41a1-b138-af5e0c84a030"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.451670 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/843b8015-e7e8-41a1-b138-af5e0c84a030-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.457810 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843b8015-e7e8-41a1-b138-af5e0c84a030-kube-api-access-8rfd5" (OuterVolumeSpecName: "kube-api-access-8rfd5") pod "843b8015-e7e8-41a1-b138-af5e0c84a030" (UID: "843b8015-e7e8-41a1-b138-af5e0c84a030"). InnerVolumeSpecName "kube-api-access-8rfd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.458079 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/843b8015-e7e8-41a1-b138-af5e0c84a030-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "843b8015-e7e8-41a1-b138-af5e0c84a030" (UID: "843b8015-e7e8-41a1-b138-af5e0c84a030"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.554642 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/843b8015-e7e8-41a1-b138-af5e0c84a030-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:03 crc kubenswrapper[5047]: I0223 07:30:03.554819 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rfd5\" (UniqueName: \"kubernetes.io/projected/843b8015-e7e8-41a1-b138-af5e0c84a030-kube-api-access-8rfd5\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:04 crc kubenswrapper[5047]: I0223 07:30:04.060195 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" event={"ID":"843b8015-e7e8-41a1-b138-af5e0c84a030","Type":"ContainerDied","Data":"4b37c5fff1c1a159196035164055250ab9490a66e747b8546c4ea01d6aad046e"} Feb 23 07:30:04 crc kubenswrapper[5047]: I0223 07:30:04.060719 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b37c5fff1c1a159196035164055250ab9490a66e747b8546c4ea01d6aad046e" Feb 23 07:30:04 crc kubenswrapper[5047]: I0223 07:30:04.060262 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc" Feb 23 07:30:04 crc kubenswrapper[5047]: I0223 07:30:04.516857 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4"] Feb 23 07:30:04 crc kubenswrapper[5047]: I0223 07:30:04.527544 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-jxmq4"] Feb 23 07:30:06 crc kubenswrapper[5047]: I0223 07:30:06.357052 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d845d9f8-e91e-46b9-b484-dbfeb6006c7f" path="/var/lib/kubelet/pods/d845d9f8-e91e-46b9-b484-dbfeb6006c7f/volumes" Feb 23 07:30:12 crc kubenswrapper[5047]: I0223 07:30:12.341430 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:30:12 crc kubenswrapper[5047]: E0223 07:30:12.342769 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.416753 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dzj8d"] Feb 23 07:30:13 crc kubenswrapper[5047]: E0223 07:30:13.417410 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843b8015-e7e8-41a1-b138-af5e0c84a030" containerName="collect-profiles" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.417442 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="843b8015-e7e8-41a1-b138-af5e0c84a030" containerName="collect-profiles" Feb 23 07:30:13 crc 
kubenswrapper[5047]: I0223 07:30:13.417800 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="843b8015-e7e8-41a1-b138-af5e0c84a030" containerName="collect-profiles" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.420217 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.440827 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzj8d"] Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.535752 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgqp\" (UniqueName: \"kubernetes.io/projected/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-kube-api-access-xtgqp\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.535841 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-utilities\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.535876 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-catalog-content\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.637479 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtgqp\" (UniqueName: 
\"kubernetes.io/projected/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-kube-api-access-xtgqp\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.637595 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-utilities\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.637630 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-catalog-content\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.638348 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-utilities\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.638449 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-catalog-content\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.665306 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtgqp\" (UniqueName: 
\"kubernetes.io/projected/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-kube-api-access-xtgqp\") pod \"certified-operators-dzj8d\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:13 crc kubenswrapper[5047]: I0223 07:30:13.753271 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:14 crc kubenswrapper[5047]: I0223 07:30:13.998247 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzj8d"] Feb 23 07:30:14 crc kubenswrapper[5047]: I0223 07:30:14.153279 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzj8d" event={"ID":"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5","Type":"ContainerStarted","Data":"300b0f212e4304e06e70908a54ac110e8b2009bb94a737099ceafead2df881dc"} Feb 23 07:30:15 crc kubenswrapper[5047]: I0223 07:30:15.162846 5047 generic.go:334] "Generic (PLEG): container finished" podID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerID="6bb8d19da5efffc9048e8372b08d65395392d051fcfc2c34bef996bc24ff67e4" exitCode=0 Feb 23 07:30:15 crc kubenswrapper[5047]: I0223 07:30:15.162945 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzj8d" event={"ID":"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5","Type":"ContainerDied","Data":"6bb8d19da5efffc9048e8372b08d65395392d051fcfc2c34bef996bc24ff67e4"} Feb 23 07:30:15 crc kubenswrapper[5047]: I0223 07:30:15.165994 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:30:16 crc kubenswrapper[5047]: I0223 07:30:16.174957 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzj8d" event={"ID":"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5","Type":"ContainerStarted","Data":"c592ffa346398d2c7714c60d147cd3bf99d98ce8d235e7f5c1fcd95cf12f76ab"} Feb 23 07:30:17 
crc kubenswrapper[5047]: I0223 07:30:17.186507 5047 generic.go:334] "Generic (PLEG): container finished" podID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerID="c592ffa346398d2c7714c60d147cd3bf99d98ce8d235e7f5c1fcd95cf12f76ab" exitCode=0 Feb 23 07:30:17 crc kubenswrapper[5047]: I0223 07:30:17.186615 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzj8d" event={"ID":"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5","Type":"ContainerDied","Data":"c592ffa346398d2c7714c60d147cd3bf99d98ce8d235e7f5c1fcd95cf12f76ab"} Feb 23 07:30:18 crc kubenswrapper[5047]: I0223 07:30:18.197689 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzj8d" event={"ID":"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5","Type":"ContainerStarted","Data":"9727b2b23cdfe6630607219f136f775f818868a23a942df5374cd398f5ecd436"} Feb 23 07:30:18 crc kubenswrapper[5047]: I0223 07:30:18.224847 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dzj8d" podStartSLOduration=2.811365725 podStartE2EDuration="5.224812315s" podCreationTimestamp="2026-02-23 07:30:13 +0000 UTC" firstStartedPulling="2026-02-23 07:30:15.165705641 +0000 UTC m=+2737.417032795" lastFinishedPulling="2026-02-23 07:30:17.579152231 +0000 UTC m=+2739.830479385" observedRunningTime="2026-02-23 07:30:18.21738685 +0000 UTC m=+2740.468714024" watchObservedRunningTime="2026-02-23 07:30:18.224812315 +0000 UTC m=+2740.476139489" Feb 23 07:30:23 crc kubenswrapper[5047]: I0223 07:30:23.341882 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:30:23 crc kubenswrapper[5047]: E0223 07:30:23.343076 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:30:23 crc kubenswrapper[5047]: I0223 07:30:23.754264 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:23 crc kubenswrapper[5047]: I0223 07:30:23.754342 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:23 crc kubenswrapper[5047]: I0223 07:30:23.810361 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:24 crc kubenswrapper[5047]: I0223 07:30:24.291359 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:24 crc kubenswrapper[5047]: I0223 07:30:24.352517 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzj8d"] Feb 23 07:30:26 crc kubenswrapper[5047]: I0223 07:30:26.265305 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dzj8d" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="registry-server" containerID="cri-o://9727b2b23cdfe6630607219f136f775f818868a23a942df5374cd398f5ecd436" gracePeriod=2 Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.283955 5047 generic.go:334] "Generic (PLEG): container finished" podID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerID="9727b2b23cdfe6630607219f136f775f818868a23a942df5374cd398f5ecd436" exitCode=0 Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.284038 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzj8d" 
event={"ID":"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5","Type":"ContainerDied","Data":"9727b2b23cdfe6630607219f136f775f818868a23a942df5374cd398f5ecd436"} Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.284785 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzj8d" event={"ID":"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5","Type":"ContainerDied","Data":"300b0f212e4304e06e70908a54ac110e8b2009bb94a737099ceafead2df881dc"} Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.284813 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="300b0f212e4304e06e70908a54ac110e8b2009bb94a737099ceafead2df881dc" Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.294864 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.372888 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-catalog-content\") pod \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.373115 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-utilities\") pod \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.373164 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtgqp\" (UniqueName: \"kubernetes.io/projected/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-kube-api-access-xtgqp\") pod \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\" (UID: \"24eb1eb2-5cef-43bb-ba85-2b01c405dcb5\") " Feb 23 07:30:27 crc kubenswrapper[5047]: 
I0223 07:30:27.375227 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-utilities" (OuterVolumeSpecName: "utilities") pod "24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" (UID: "24eb1eb2-5cef-43bb-ba85-2b01c405dcb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.381701 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-kube-api-access-xtgqp" (OuterVolumeSpecName: "kube-api-access-xtgqp") pod "24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" (UID: "24eb1eb2-5cef-43bb-ba85-2b01c405dcb5"). InnerVolumeSpecName "kube-api-access-xtgqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.443232 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" (UID: "24eb1eb2-5cef-43bb-ba85-2b01c405dcb5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.475827 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.475885 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:27 crc kubenswrapper[5047]: I0223 07:30:27.475944 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtgqp\" (UniqueName: \"kubernetes.io/projected/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5-kube-api-access-xtgqp\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:28 crc kubenswrapper[5047]: I0223 07:30:28.293468 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzj8d" Feb 23 07:30:28 crc kubenswrapper[5047]: I0223 07:30:28.336979 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzj8d"] Feb 23 07:30:28 crc kubenswrapper[5047]: I0223 07:30:28.350394 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dzj8d"] Feb 23 07:30:30 crc kubenswrapper[5047]: I0223 07:30:30.356507 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" path="/var/lib/kubelet/pods/24eb1eb2-5cef-43bb-ba85-2b01c405dcb5/volumes" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.936505 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b5n85"] Feb 23 07:30:32 crc kubenswrapper[5047]: E0223 07:30:32.937331 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="extract-content" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.937352 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="extract-content" Feb 23 07:30:32 crc kubenswrapper[5047]: E0223 07:30:32.937381 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="extract-utilities" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.937392 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="extract-utilities" Feb 23 07:30:32 crc kubenswrapper[5047]: E0223 07:30:32.937408 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="registry-server" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.937419 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="registry-server" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.937648 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="24eb1eb2-5cef-43bb-ba85-2b01c405dcb5" containerName="registry-server" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.939209 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.974056 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b5n85"] Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.976147 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxgz7\" (UniqueName: \"kubernetes.io/projected/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-kube-api-access-mxgz7\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.976208 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-utilities\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:32 crc kubenswrapper[5047]: I0223 07:30:32.976310 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-catalog-content\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.078235 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxgz7\" (UniqueName: \"kubernetes.io/projected/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-kube-api-access-mxgz7\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.078565 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-utilities\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.078743 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-catalog-content\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.079254 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-utilities\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.079674 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-catalog-content\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.100173 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxgz7\" (UniqueName: \"kubernetes.io/projected/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-kube-api-access-mxgz7\") pod \"redhat-operators-b5n85\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.273384 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:33 crc kubenswrapper[5047]: I0223 07:30:33.538841 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b5n85"] Feb 23 07:30:34 crc kubenswrapper[5047]: I0223 07:30:34.363354 5047 generic.go:334] "Generic (PLEG): container finished" podID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerID="9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3" exitCode=0 Feb 23 07:30:34 crc kubenswrapper[5047]: I0223 07:30:34.363462 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5n85" event={"ID":"f3fac8e4-56a6-4b79-b5a6-99b938425b1f","Type":"ContainerDied","Data":"9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3"} Feb 23 07:30:34 crc kubenswrapper[5047]: I0223 07:30:34.363671 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5n85" event={"ID":"f3fac8e4-56a6-4b79-b5a6-99b938425b1f","Type":"ContainerStarted","Data":"e98adc12008d7adf8c4faac280a7cd0de8a8c90d60d8c95588df033b41d73346"} Feb 23 07:30:36 crc kubenswrapper[5047]: I0223 07:30:36.385673 5047 generic.go:334] "Generic (PLEG): container finished" podID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerID="dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284" exitCode=0 Feb 23 07:30:36 crc kubenswrapper[5047]: I0223 07:30:36.386084 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5n85" event={"ID":"f3fac8e4-56a6-4b79-b5a6-99b938425b1f","Type":"ContainerDied","Data":"dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284"} Feb 23 07:30:37 crc kubenswrapper[5047]: I0223 07:30:37.341556 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:30:37 crc kubenswrapper[5047]: E0223 07:30:37.342671 5047 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:30:37 crc kubenswrapper[5047]: I0223 07:30:37.400168 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5n85" event={"ID":"f3fac8e4-56a6-4b79-b5a6-99b938425b1f","Type":"ContainerStarted","Data":"6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435"} Feb 23 07:30:37 crc kubenswrapper[5047]: I0223 07:30:37.429312 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b5n85" podStartSLOduration=2.996139058 podStartE2EDuration="5.429286999s" podCreationTimestamp="2026-02-23 07:30:32 +0000 UTC" firstStartedPulling="2026-02-23 07:30:34.368312505 +0000 UTC m=+2756.619639649" lastFinishedPulling="2026-02-23 07:30:36.801460456 +0000 UTC m=+2759.052787590" observedRunningTime="2026-02-23 07:30:37.422389017 +0000 UTC m=+2759.673716221" watchObservedRunningTime="2026-02-23 07:30:37.429286999 +0000 UTC m=+2759.680614143" Feb 23 07:30:43 crc kubenswrapper[5047]: I0223 07:30:43.274236 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:43 crc kubenswrapper[5047]: I0223 07:30:43.276176 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:44 crc kubenswrapper[5047]: I0223 07:30:44.325702 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b5n85" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="registry-server" probeResult="failure" output=< Feb 23 
07:30:44 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 07:30:44 crc kubenswrapper[5047]: > Feb 23 07:30:50 crc kubenswrapper[5047]: I0223 07:30:50.340577 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:30:51 crc kubenswrapper[5047]: I0223 07:30:51.532817 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"9aa5e0174e70f4c81659c1ed89e399ba24809d231adef2083d34abf38a5b43e1"} Feb 23 07:30:53 crc kubenswrapper[5047]: I0223 07:30:53.339724 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:53 crc kubenswrapper[5047]: I0223 07:30:53.349230 5047 scope.go:117] "RemoveContainer" containerID="8f6e37f1f6cd5a01340c7a7c2791d58e7a3095aab22cfed70b914d2f52d6dc8d" Feb 23 07:30:53 crc kubenswrapper[5047]: I0223 07:30:53.410257 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:53 crc kubenswrapper[5047]: I0223 07:30:53.585119 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b5n85"] Feb 23 07:30:54 crc kubenswrapper[5047]: I0223 07:30:54.561594 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b5n85" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="registry-server" containerID="cri-o://6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435" gracePeriod=2 Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.013063 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.059667 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-catalog-content\") pod \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.059837 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxgz7\" (UniqueName: \"kubernetes.io/projected/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-kube-api-access-mxgz7\") pod \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.059885 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-utilities\") pod \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\" (UID: \"f3fac8e4-56a6-4b79-b5a6-99b938425b1f\") " Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.060809 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-utilities" (OuterVolumeSpecName: "utilities") pod "f3fac8e4-56a6-4b79-b5a6-99b938425b1f" (UID: "f3fac8e4-56a6-4b79-b5a6-99b938425b1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.079212 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-kube-api-access-mxgz7" (OuterVolumeSpecName: "kube-api-access-mxgz7") pod "f3fac8e4-56a6-4b79-b5a6-99b938425b1f" (UID: "f3fac8e4-56a6-4b79-b5a6-99b938425b1f"). InnerVolumeSpecName "kube-api-access-mxgz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.162588 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxgz7\" (UniqueName: \"kubernetes.io/projected/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-kube-api-access-mxgz7\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.162628 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.266491 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3fac8e4-56a6-4b79-b5a6-99b938425b1f" (UID: "f3fac8e4-56a6-4b79-b5a6-99b938425b1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.368376 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fac8e4-56a6-4b79-b5a6-99b938425b1f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.578316 5047 generic.go:334] "Generic (PLEG): container finished" podID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerID="6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435" exitCode=0 Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.578412 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5n85" event={"ID":"f3fac8e4-56a6-4b79-b5a6-99b938425b1f","Type":"ContainerDied","Data":"6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435"} Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.578438 5047 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b5n85" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.578476 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5n85" event={"ID":"f3fac8e4-56a6-4b79-b5a6-99b938425b1f","Type":"ContainerDied","Data":"e98adc12008d7adf8c4faac280a7cd0de8a8c90d60d8c95588df033b41d73346"} Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.578508 5047 scope.go:117] "RemoveContainer" containerID="6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.628345 5047 scope.go:117] "RemoveContainer" containerID="dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.644849 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b5n85"] Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.653980 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b5n85"] Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.671572 5047 scope.go:117] "RemoveContainer" containerID="9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.694850 5047 scope.go:117] "RemoveContainer" containerID="6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435" Feb 23 07:30:55 crc kubenswrapper[5047]: E0223 07:30:55.702471 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435\": container with ID starting with 6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435 not found: ID does not exist" containerID="6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.702533 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435"} err="failed to get container status \"6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435\": rpc error: code = NotFound desc = could not find container \"6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435\": container with ID starting with 6ffddab3fa6eddc557ca446a26206b44ebaf689d2f2467a8a5b3deb06a1f5435 not found: ID does not exist" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.702568 5047 scope.go:117] "RemoveContainer" containerID="dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284" Feb 23 07:30:55 crc kubenswrapper[5047]: E0223 07:30:55.702839 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284\": container with ID starting with dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284 not found: ID does not exist" containerID="dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.702870 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284"} err="failed to get container status \"dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284\": rpc error: code = NotFound desc = could not find container \"dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284\": container with ID starting with dabc24fb993c593f3b17ec2bcf1f4810c34613069602280157b52222fcd80284 not found: ID does not exist" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.702888 5047 scope.go:117] "RemoveContainer" containerID="9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3" Feb 23 07:30:55 crc kubenswrapper[5047]: E0223 
07:30:55.703144 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3\": container with ID starting with 9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3 not found: ID does not exist" containerID="9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3" Feb 23 07:30:55 crc kubenswrapper[5047]: I0223 07:30:55.703167 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3"} err="failed to get container status \"9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3\": rpc error: code = NotFound desc = could not find container \"9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3\": container with ID starting with 9578b96929a7b8888a3cf0d34c3941669b01f30464684eedbbef65dac1213bb3 not found: ID does not exist" Feb 23 07:30:56 crc kubenswrapper[5047]: I0223 07:30:56.360049 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" path="/var/lib/kubelet/pods/f3fac8e4-56a6-4b79-b5a6-99b938425b1f/volumes" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.259878 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mtmpz"] Feb 23 07:32:54 crc kubenswrapper[5047]: E0223 07:32:54.261180 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="extract-content" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.261209 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="extract-content" Feb 23 07:32:54 crc kubenswrapper[5047]: E0223 07:32:54.261262 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="extract-utilities" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.261280 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="extract-utilities" Feb 23 07:32:54 crc kubenswrapper[5047]: E0223 07:32:54.261306 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="registry-server" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.261323 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="registry-server" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.261690 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3fac8e4-56a6-4b79-b5a6-99b938425b1f" containerName="registry-server" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.263641 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.286280 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtmpz"] Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.355341 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwd6x\" (UniqueName: \"kubernetes.io/projected/f0e601da-5ae0-4c40-8742-328404fd4ce0-kube-api-access-fwd6x\") pod \"community-operators-mtmpz\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.355440 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-catalog-content\") pod \"community-operators-mtmpz\" (UID: 
\"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.355520 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-utilities\") pod \"community-operators-mtmpz\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.456716 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-catalog-content\") pod \"community-operators-mtmpz\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.456822 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-utilities\") pod \"community-operators-mtmpz\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.456872 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwd6x\" (UniqueName: \"kubernetes.io/projected/f0e601da-5ae0-4c40-8742-328404fd4ce0-kube-api-access-fwd6x\") pod \"community-operators-mtmpz\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.458121 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-catalog-content\") pod \"community-operators-mtmpz\" (UID: 
\"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.458845 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-utilities\") pod \"community-operators-mtmpz\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.477697 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwd6x\" (UniqueName: \"kubernetes.io/projected/f0e601da-5ae0-4c40-8742-328404fd4ce0-kube-api-access-fwd6x\") pod \"community-operators-mtmpz\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:54 crc kubenswrapper[5047]: I0223 07:32:54.583660 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:32:55 crc kubenswrapper[5047]: I0223 07:32:55.080254 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtmpz"] Feb 23 07:32:55 crc kubenswrapper[5047]: I0223 07:32:55.742515 5047 generic.go:334] "Generic (PLEG): container finished" podID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerID="0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4" exitCode=0 Feb 23 07:32:55 crc kubenswrapper[5047]: I0223 07:32:55.742569 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtmpz" event={"ID":"f0e601da-5ae0-4c40-8742-328404fd4ce0","Type":"ContainerDied","Data":"0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4"} Feb 23 07:32:55 crc kubenswrapper[5047]: I0223 07:32:55.742598 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtmpz" 
event={"ID":"f0e601da-5ae0-4c40-8742-328404fd4ce0","Type":"ContainerStarted","Data":"564325d99ea8c729122a665e06490f08ede1490b639faba0d30deb9c85843be3"} Feb 23 07:32:56 crc kubenswrapper[5047]: I0223 07:32:56.753592 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtmpz" event={"ID":"f0e601da-5ae0-4c40-8742-328404fd4ce0","Type":"ContainerStarted","Data":"57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637"} Feb 23 07:32:57 crc kubenswrapper[5047]: I0223 07:32:57.768528 5047 generic.go:334] "Generic (PLEG): container finished" podID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerID="57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637" exitCode=0 Feb 23 07:32:57 crc kubenswrapper[5047]: I0223 07:32:57.768637 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtmpz" event={"ID":"f0e601da-5ae0-4c40-8742-328404fd4ce0","Type":"ContainerDied","Data":"57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637"} Feb 23 07:32:58 crc kubenswrapper[5047]: I0223 07:32:58.780193 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtmpz" event={"ID":"f0e601da-5ae0-4c40-8742-328404fd4ce0","Type":"ContainerStarted","Data":"c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da"} Feb 23 07:32:58 crc kubenswrapper[5047]: I0223 07:32:58.810756 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mtmpz" podStartSLOduration=2.38244149 podStartE2EDuration="4.81073173s" podCreationTimestamp="2026-02-23 07:32:54 +0000 UTC" firstStartedPulling="2026-02-23 07:32:55.743950556 +0000 UTC m=+2897.995277690" lastFinishedPulling="2026-02-23 07:32:58.172240786 +0000 UTC m=+2900.423567930" observedRunningTime="2026-02-23 07:32:58.805963021 +0000 UTC m=+2901.057290195" watchObservedRunningTime="2026-02-23 07:32:58.81073173 +0000 UTC 
m=+2901.062058874" Feb 23 07:33:04 crc kubenswrapper[5047]: I0223 07:33:04.584833 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:33:04 crc kubenswrapper[5047]: I0223 07:33:04.585751 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:33:04 crc kubenswrapper[5047]: I0223 07:33:04.648323 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:33:04 crc kubenswrapper[5047]: I0223 07:33:04.896983 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:33:04 crc kubenswrapper[5047]: I0223 07:33:04.981528 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtmpz"] Feb 23 07:33:06 crc kubenswrapper[5047]: I0223 07:33:06.853179 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mtmpz" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="registry-server" containerID="cri-o://c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da" gracePeriod=2 Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.364235 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.476980 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-catalog-content\") pod \"f0e601da-5ae0-4c40-8742-328404fd4ce0\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.477081 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwd6x\" (UniqueName: \"kubernetes.io/projected/f0e601da-5ae0-4c40-8742-328404fd4ce0-kube-api-access-fwd6x\") pod \"f0e601da-5ae0-4c40-8742-328404fd4ce0\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.477123 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-utilities\") pod \"f0e601da-5ae0-4c40-8742-328404fd4ce0\" (UID: \"f0e601da-5ae0-4c40-8742-328404fd4ce0\") " Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.478209 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-utilities" (OuterVolumeSpecName: "utilities") pod "f0e601da-5ae0-4c40-8742-328404fd4ce0" (UID: "f0e601da-5ae0-4c40-8742-328404fd4ce0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.486112 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e601da-5ae0-4c40-8742-328404fd4ce0-kube-api-access-fwd6x" (OuterVolumeSpecName: "kube-api-access-fwd6x") pod "f0e601da-5ae0-4c40-8742-328404fd4ce0" (UID: "f0e601da-5ae0-4c40-8742-328404fd4ce0"). InnerVolumeSpecName "kube-api-access-fwd6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.530439 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0e601da-5ae0-4c40-8742-328404fd4ce0" (UID: "f0e601da-5ae0-4c40-8742-328404fd4ce0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.579251 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.579289 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwd6x\" (UniqueName: \"kubernetes.io/projected/f0e601da-5ae0-4c40-8742-328404fd4ce0-kube-api-access-fwd6x\") on node \"crc\" DevicePath \"\"" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.579303 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e601da-5ae0-4c40-8742-328404fd4ce0-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.867220 5047 generic.go:334] "Generic (PLEG): container finished" podID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerID="c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da" exitCode=0 Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.867317 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtmpz" event={"ID":"f0e601da-5ae0-4c40-8742-328404fd4ce0","Type":"ContainerDied","Data":"c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da"} Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.867385 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-mtmpz" event={"ID":"f0e601da-5ae0-4c40-8742-328404fd4ce0","Type":"ContainerDied","Data":"564325d99ea8c729122a665e06490f08ede1490b639faba0d30deb9c85843be3"} Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.867379 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtmpz" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.867414 5047 scope.go:117] "RemoveContainer" containerID="c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.908818 5047 scope.go:117] "RemoveContainer" containerID="57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.926079 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtmpz"] Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.937747 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mtmpz"] Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.948827 5047 scope.go:117] "RemoveContainer" containerID="0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.992768 5047 scope.go:117] "RemoveContainer" containerID="c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da" Feb 23 07:33:07 crc kubenswrapper[5047]: E0223 07:33:07.993516 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da\": container with ID starting with c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da not found: ID does not exist" containerID="c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 
07:33:07.993648 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da"} err="failed to get container status \"c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da\": rpc error: code = NotFound desc = could not find container \"c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da\": container with ID starting with c37e4ec7b58531454611b2673aa78b4bed58d3a23e3ac1d9ca86ca6c432ac9da not found: ID does not exist" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.993703 5047 scope.go:117] "RemoveContainer" containerID="57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637" Feb 23 07:33:07 crc kubenswrapper[5047]: E0223 07:33:07.994256 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637\": container with ID starting with 57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637 not found: ID does not exist" containerID="57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.994324 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637"} err="failed to get container status \"57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637\": rpc error: code = NotFound desc = could not find container \"57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637\": container with ID starting with 57b9b3f1fd184a4ae1796df0b0e378de5a78fd03e03564d9386c9731958b7637 not found: ID does not exist" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.994369 5047 scope.go:117] "RemoveContainer" containerID="0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4" Feb 23 07:33:07 crc 
kubenswrapper[5047]: E0223 07:33:07.994794 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4\": container with ID starting with 0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4 not found: ID does not exist" containerID="0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4" Feb 23 07:33:07 crc kubenswrapper[5047]: I0223 07:33:07.994855 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4"} err="failed to get container status \"0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4\": rpc error: code = NotFound desc = could not find container \"0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4\": container with ID starting with 0435c5eaae1ed46286d90fc36508de70c34b8b0890a1243161bfd06990eb08a4 not found: ID does not exist" Feb 23 07:33:08 crc kubenswrapper[5047]: I0223 07:33:08.352952 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" path="/var/lib/kubelet/pods/f0e601da-5ae0-4c40-8742-328404fd4ce0/volumes" Feb 23 07:33:16 crc kubenswrapper[5047]: I0223 07:33:16.760548 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:33:16 crc kubenswrapper[5047]: I0223 07:33:16.761359 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 23 07:33:46 crc kubenswrapper[5047]: I0223 07:33:46.760586 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:33:46 crc kubenswrapper[5047]: I0223 07:33:46.763134 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.532598 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vg62j"] Feb 23 07:33:51 crc kubenswrapper[5047]: E0223 07:33:51.534454 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="extract-content" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.534493 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="extract-content" Feb 23 07:33:51 crc kubenswrapper[5047]: E0223 07:33:51.534515 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="extract-utilities" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.534523 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="extract-utilities" Feb 23 07:33:51 crc kubenswrapper[5047]: E0223 07:33:51.534530 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="registry-server" Feb 23 07:33:51 crc 
kubenswrapper[5047]: I0223 07:33:51.534536 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="registry-server" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.534729 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e601da-5ae0-4c40-8742-328404fd4ce0" containerName="registry-server" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.536176 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.539176 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg62j"] Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.727825 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-catalog-content\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.727893 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-utilities\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.727958 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqltp\" (UniqueName: \"kubernetes.io/projected/637d427e-02e8-47a0-8c50-adb9b2807b49-kube-api-access-pqltp\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 
crc kubenswrapper[5047]: I0223 07:33:51.829077 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-catalog-content\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.829139 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-utilities\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.829179 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqltp\" (UniqueName: \"kubernetes.io/projected/637d427e-02e8-47a0-8c50-adb9b2807b49-kube-api-access-pqltp\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.829822 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-catalog-content\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.829943 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-utilities\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.863430 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqltp\" (UniqueName: \"kubernetes.io/projected/637d427e-02e8-47a0-8c50-adb9b2807b49-kube-api-access-pqltp\") pod \"redhat-marketplace-vg62j\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:51 crc kubenswrapper[5047]: I0223 07:33:51.871562 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:33:52 crc kubenswrapper[5047]: I0223 07:33:52.142371 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg62j"] Feb 23 07:33:52 crc kubenswrapper[5047]: I0223 07:33:52.334639 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg62j" event={"ID":"637d427e-02e8-47a0-8c50-adb9b2807b49","Type":"ContainerStarted","Data":"a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7"} Feb 23 07:33:52 crc kubenswrapper[5047]: I0223 07:33:52.335301 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg62j" event={"ID":"637d427e-02e8-47a0-8c50-adb9b2807b49","Type":"ContainerStarted","Data":"6e53511ca0185c8e4798be915f62a95461ba2e45b0d8080fdbcac3f9d151af85"} Feb 23 07:33:53 crc kubenswrapper[5047]: I0223 07:33:53.346111 5047 generic.go:334] "Generic (PLEG): container finished" podID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerID="a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7" exitCode=0 Feb 23 07:33:53 crc kubenswrapper[5047]: I0223 07:33:53.346247 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg62j" event={"ID":"637d427e-02e8-47a0-8c50-adb9b2807b49","Type":"ContainerDied","Data":"a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7"} Feb 23 07:33:54 crc kubenswrapper[5047]: I0223 07:33:54.364588 5047 
generic.go:334] "Generic (PLEG): container finished" podID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerID="89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb" exitCode=0 Feb 23 07:33:54 crc kubenswrapper[5047]: I0223 07:33:54.364759 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg62j" event={"ID":"637d427e-02e8-47a0-8c50-adb9b2807b49","Type":"ContainerDied","Data":"89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb"} Feb 23 07:33:55 crc kubenswrapper[5047]: I0223 07:33:55.378179 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg62j" event={"ID":"637d427e-02e8-47a0-8c50-adb9b2807b49","Type":"ContainerStarted","Data":"44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4"} Feb 23 07:33:55 crc kubenswrapper[5047]: I0223 07:33:55.405744 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vg62j" podStartSLOduration=2.789797598 podStartE2EDuration="4.405715052s" podCreationTimestamp="2026-02-23 07:33:51 +0000 UTC" firstStartedPulling="2026-02-23 07:33:53.348984295 +0000 UTC m=+2955.600311439" lastFinishedPulling="2026-02-23 07:33:54.964901719 +0000 UTC m=+2957.216228893" observedRunningTime="2026-02-23 07:33:55.397484731 +0000 UTC m=+2957.648811865" watchObservedRunningTime="2026-02-23 07:33:55.405715052 +0000 UTC m=+2957.657042226" Feb 23 07:34:01 crc kubenswrapper[5047]: I0223 07:34:01.872870 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:34:01 crc kubenswrapper[5047]: I0223 07:34:01.873727 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:34:01 crc kubenswrapper[5047]: I0223 07:34:01.954251 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:34:02 crc kubenswrapper[5047]: I0223 07:34:02.486338 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:34:02 crc kubenswrapper[5047]: I0223 07:34:02.576763 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg62j"] Feb 23 07:34:04 crc kubenswrapper[5047]: I0223 07:34:04.460410 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vg62j" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="registry-server" containerID="cri-o://44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4" gracePeriod=2 Feb 23 07:34:04 crc kubenswrapper[5047]: I0223 07:34:04.980386 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.074187 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-utilities\") pod \"637d427e-02e8-47a0-8c50-adb9b2807b49\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.074395 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-catalog-content\") pod \"637d427e-02e8-47a0-8c50-adb9b2807b49\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.074505 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqltp\" (UniqueName: \"kubernetes.io/projected/637d427e-02e8-47a0-8c50-adb9b2807b49-kube-api-access-pqltp\") pod 
\"637d427e-02e8-47a0-8c50-adb9b2807b49\" (UID: \"637d427e-02e8-47a0-8c50-adb9b2807b49\") " Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.075379 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-utilities" (OuterVolumeSpecName: "utilities") pod "637d427e-02e8-47a0-8c50-adb9b2807b49" (UID: "637d427e-02e8-47a0-8c50-adb9b2807b49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.085740 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637d427e-02e8-47a0-8c50-adb9b2807b49-kube-api-access-pqltp" (OuterVolumeSpecName: "kube-api-access-pqltp") pod "637d427e-02e8-47a0-8c50-adb9b2807b49" (UID: "637d427e-02e8-47a0-8c50-adb9b2807b49"). InnerVolumeSpecName "kube-api-access-pqltp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.114548 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "637d427e-02e8-47a0-8c50-adb9b2807b49" (UID: "637d427e-02e8-47a0-8c50-adb9b2807b49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.175787 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.175825 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqltp\" (UniqueName: \"kubernetes.io/projected/637d427e-02e8-47a0-8c50-adb9b2807b49-kube-api-access-pqltp\") on node \"crc\" DevicePath \"\"" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.175841 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637d427e-02e8-47a0-8c50-adb9b2807b49-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.474684 5047 generic.go:334] "Generic (PLEG): container finished" podID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerID="44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4" exitCode=0 Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.474731 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg62j" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.474741 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg62j" event={"ID":"637d427e-02e8-47a0-8c50-adb9b2807b49","Type":"ContainerDied","Data":"44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4"} Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.474771 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg62j" event={"ID":"637d427e-02e8-47a0-8c50-adb9b2807b49","Type":"ContainerDied","Data":"6e53511ca0185c8e4798be915f62a95461ba2e45b0d8080fdbcac3f9d151af85"} Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.474795 5047 scope.go:117] "RemoveContainer" containerID="44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.516295 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg62j"] Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.522094 5047 scope.go:117] "RemoveContainer" containerID="89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.524819 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg62j"] Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.554501 5047 scope.go:117] "RemoveContainer" containerID="a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.578450 5047 scope.go:117] "RemoveContainer" containerID="44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4" Feb 23 07:34:05 crc kubenswrapper[5047]: E0223 07:34:05.579242 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4\": container with ID starting with 44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4 not found: ID does not exist" containerID="44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.579339 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4"} err="failed to get container status \"44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4\": rpc error: code = NotFound desc = could not find container \"44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4\": container with ID starting with 44af8f10fbac86df6f59f230aa36914fb9f694bf015a26e36495c7ab51b686c4 not found: ID does not exist" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.579390 5047 scope.go:117] "RemoveContainer" containerID="89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb" Feb 23 07:34:05 crc kubenswrapper[5047]: E0223 07:34:05.579865 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb\": container with ID starting with 89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb not found: ID does not exist" containerID="89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.579945 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb"} err="failed to get container status \"89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb\": rpc error: code = NotFound desc = could not find container \"89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb\": container with ID 
starting with 89602658f839880d9c67ff7fd039af8c6f1d5dd2cdfaf2c7f7c6175756a5abbb not found: ID does not exist" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.579978 5047 scope.go:117] "RemoveContainer" containerID="a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7" Feb 23 07:34:05 crc kubenswrapper[5047]: E0223 07:34:05.580379 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7\": container with ID starting with a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7 not found: ID does not exist" containerID="a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7" Feb 23 07:34:05 crc kubenswrapper[5047]: I0223 07:34:05.580421 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7"} err="failed to get container status \"a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7\": rpc error: code = NotFound desc = could not find container \"a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7\": container with ID starting with a7534ce9e2d756b57421f116ee56175b50f028982a964e8ad4a7e2771d0133c7 not found: ID does not exist" Feb 23 07:34:06 crc kubenswrapper[5047]: I0223 07:34:06.356351 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" path="/var/lib/kubelet/pods/637d427e-02e8-47a0-8c50-adb9b2807b49/volumes" Feb 23 07:34:16 crc kubenswrapper[5047]: I0223 07:34:16.759697 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:34:16 crc kubenswrapper[5047]: I0223 
07:34:16.761402 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:34:16 crc kubenswrapper[5047]: I0223 07:34:16.761664 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:34:16 crc kubenswrapper[5047]: I0223 07:34:16.762764 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9aa5e0174e70f4c81659c1ed89e399ba24809d231adef2083d34abf38a5b43e1"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:34:16 crc kubenswrapper[5047]: I0223 07:34:16.762986 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://9aa5e0174e70f4c81659c1ed89e399ba24809d231adef2083d34abf38a5b43e1" gracePeriod=600 Feb 23 07:34:17 crc kubenswrapper[5047]: I0223 07:34:17.602526 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"9aa5e0174e70f4c81659c1ed89e399ba24809d231adef2083d34abf38a5b43e1"} Feb 23 07:34:17 crc kubenswrapper[5047]: I0223 07:34:17.602579 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="9aa5e0174e70f4c81659c1ed89e399ba24809d231adef2083d34abf38a5b43e1" exitCode=0 Feb 23 07:34:17 crc 
kubenswrapper[5047]: I0223 07:34:17.603410 5047 scope.go:117] "RemoveContainer" containerID="3ef7fa8794001773a7a23e319386dfd331f12b70067e1a9cb0ae90f1d8a89a9f" Feb 23 07:34:17 crc kubenswrapper[5047]: I0223 07:34:17.603440 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e"} Feb 23 07:36:46 crc kubenswrapper[5047]: I0223 07:36:46.760026 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:36:46 crc kubenswrapper[5047]: I0223 07:36:46.761273 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:36:53 crc kubenswrapper[5047]: I0223 07:36:53.627848 5047 scope.go:117] "RemoveContainer" containerID="9727b2b23cdfe6630607219f136f775f818868a23a942df5374cd398f5ecd436" Feb 23 07:36:53 crc kubenswrapper[5047]: I0223 07:36:53.654867 5047 scope.go:117] "RemoveContainer" containerID="6bb8d19da5efffc9048e8372b08d65395392d051fcfc2c34bef996bc24ff67e4" Feb 23 07:36:53 crc kubenswrapper[5047]: I0223 07:36:53.671878 5047 scope.go:117] "RemoveContainer" containerID="c592ffa346398d2c7714c60d147cd3bf99d98ce8d235e7f5c1fcd95cf12f76ab" Feb 23 07:37:16 crc kubenswrapper[5047]: I0223 07:37:16.759475 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:37:16 crc kubenswrapper[5047]: I0223 07:37:16.761006 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:37:46 crc kubenswrapper[5047]: I0223 07:37:46.759804 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:37:46 crc kubenswrapper[5047]: I0223 07:37:46.760743 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:37:46 crc kubenswrapper[5047]: I0223 07:37:46.760821 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:37:46 crc kubenswrapper[5047]: I0223 07:37:46.761896 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:37:46 crc kubenswrapper[5047]: I0223 07:37:46.762044 5047 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" gracePeriod=600 Feb 23 07:37:46 crc kubenswrapper[5047]: E0223 07:37:46.911404 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:37:47 crc kubenswrapper[5047]: I0223 07:37:47.808234 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" exitCode=0 Feb 23 07:37:47 crc kubenswrapper[5047]: I0223 07:37:47.808252 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e"} Feb 23 07:37:47 crc kubenswrapper[5047]: I0223 07:37:47.808326 5047 scope.go:117] "RemoveContainer" containerID="9aa5e0174e70f4c81659c1ed89e399ba24809d231adef2083d34abf38a5b43e1" Feb 23 07:37:47 crc kubenswrapper[5047]: I0223 07:37:47.809185 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:37:47 crc kubenswrapper[5047]: E0223 07:37:47.809695 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:37:59 crc kubenswrapper[5047]: I0223 07:37:59.341577 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:37:59 crc kubenswrapper[5047]: E0223 07:37:59.342862 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:38:11 crc kubenswrapper[5047]: I0223 07:38:11.341722 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:38:11 crc kubenswrapper[5047]: E0223 07:38:11.344308 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:38:24 crc kubenswrapper[5047]: I0223 07:38:24.340738 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:38:24 crc kubenswrapper[5047]: E0223 07:38:24.341687 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:38:39 crc kubenswrapper[5047]: I0223 07:38:39.341419 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:38:39 crc kubenswrapper[5047]: E0223 07:38:39.342440 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:38:54 crc kubenswrapper[5047]: I0223 07:38:54.341493 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:38:54 crc kubenswrapper[5047]: E0223 07:38:54.342794 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:39:08 crc kubenswrapper[5047]: I0223 07:39:08.350795 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:39:08 crc kubenswrapper[5047]: E0223 07:39:08.351780 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:39:20 crc kubenswrapper[5047]: I0223 07:39:20.341255 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:39:20 crc kubenswrapper[5047]: E0223 07:39:20.342677 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:39:31 crc kubenswrapper[5047]: I0223 07:39:31.341137 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:39:31 crc kubenswrapper[5047]: E0223 07:39:31.342304 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:39:42 crc kubenswrapper[5047]: I0223 07:39:42.341670 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:39:42 crc kubenswrapper[5047]: E0223 07:39:42.343052 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:39:57 crc kubenswrapper[5047]: I0223 07:39:57.341493 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:39:57 crc kubenswrapper[5047]: E0223 07:39:57.342802 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:40:10 crc kubenswrapper[5047]: I0223 07:40:10.340894 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:40:10 crc kubenswrapper[5047]: E0223 07:40:10.342486 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:40:23 crc kubenswrapper[5047]: I0223 07:40:23.340878 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:40:23 crc kubenswrapper[5047]: E0223 07:40:23.341575 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:40:36 crc kubenswrapper[5047]: I0223 07:40:36.342581 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:40:36 crc kubenswrapper[5047]: E0223 07:40:36.344374 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:40:51 crc kubenswrapper[5047]: I0223 07:40:51.341555 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:40:51 crc kubenswrapper[5047]: E0223 07:40:51.344159 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:41:05 crc kubenswrapper[5047]: I0223 07:41:05.341324 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:41:05 crc kubenswrapper[5047]: E0223 07:41:05.342576 5047 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.209653 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fjcj5"] Feb 23 07:41:09 crc kubenswrapper[5047]: E0223 07:41:09.210661 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="extract-content" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.210679 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="extract-content" Feb 23 07:41:09 crc kubenswrapper[5047]: E0223 07:41:09.210697 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="registry-server" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.210709 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="registry-server" Feb 23 07:41:09 crc kubenswrapper[5047]: E0223 07:41:09.210729 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="extract-utilities" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.210738 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="extract-utilities" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.212628 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="637d427e-02e8-47a0-8c50-adb9b2807b49" containerName="registry-server" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 
07:41:09.214216 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.234996 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fjcj5"] Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.309961 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-utilities\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.310157 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krnzt\" (UniqueName: \"kubernetes.io/projected/89402f00-3610-493b-9052-9eb1b645d836-kube-api-access-krnzt\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.310793 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-catalog-content\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.412330 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-catalog-content\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc 
kubenswrapper[5047]: I0223 07:41:09.412739 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-utilities\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.412891 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krnzt\" (UniqueName: \"kubernetes.io/projected/89402f00-3610-493b-9052-9eb1b645d836-kube-api-access-krnzt\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.413025 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-catalog-content\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.413600 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-utilities\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.435205 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krnzt\" (UniqueName: \"kubernetes.io/projected/89402f00-3610-493b-9052-9eb1b645d836-kube-api-access-krnzt\") pod \"certified-operators-fjcj5\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 
07:41:09.543530 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:09 crc kubenswrapper[5047]: I0223 07:41:09.841127 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fjcj5"] Feb 23 07:41:10 crc kubenswrapper[5047]: I0223 07:41:10.278931 5047 generic.go:334] "Generic (PLEG): container finished" podID="89402f00-3610-493b-9052-9eb1b645d836" containerID="0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411" exitCode=0 Feb 23 07:41:10 crc kubenswrapper[5047]: I0223 07:41:10.278978 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjcj5" event={"ID":"89402f00-3610-493b-9052-9eb1b645d836","Type":"ContainerDied","Data":"0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411"} Feb 23 07:41:10 crc kubenswrapper[5047]: I0223 07:41:10.279005 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjcj5" event={"ID":"89402f00-3610-493b-9052-9eb1b645d836","Type":"ContainerStarted","Data":"cd086b7f7c2834dfe6c301373ef28103237fed894b789212fe0dd3052a6a2971"} Feb 23 07:41:10 crc kubenswrapper[5047]: I0223 07:41:10.280868 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:41:11 crc kubenswrapper[5047]: I0223 07:41:11.288314 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjcj5" event={"ID":"89402f00-3610-493b-9052-9eb1b645d836","Type":"ContainerStarted","Data":"c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0"} Feb 23 07:41:12 crc kubenswrapper[5047]: I0223 07:41:12.299974 5047 generic.go:334] "Generic (PLEG): container finished" podID="89402f00-3610-493b-9052-9eb1b645d836" containerID="c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0" exitCode=0 Feb 23 07:41:12 crc 
kubenswrapper[5047]: I0223 07:41:12.300061 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjcj5" event={"ID":"89402f00-3610-493b-9052-9eb1b645d836","Type":"ContainerDied","Data":"c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0"} Feb 23 07:41:13 crc kubenswrapper[5047]: I0223 07:41:13.313838 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjcj5" event={"ID":"89402f00-3610-493b-9052-9eb1b645d836","Type":"ContainerStarted","Data":"570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d"} Feb 23 07:41:13 crc kubenswrapper[5047]: I0223 07:41:13.348863 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fjcj5" podStartSLOduration=1.9282896950000001 podStartE2EDuration="4.348832884s" podCreationTimestamp="2026-02-23 07:41:09 +0000 UTC" firstStartedPulling="2026-02-23 07:41:10.280664816 +0000 UTC m=+3392.531991950" lastFinishedPulling="2026-02-23 07:41:12.701208005 +0000 UTC m=+3394.952535139" observedRunningTime="2026-02-23 07:41:13.337343287 +0000 UTC m=+3395.588670431" watchObservedRunningTime="2026-02-23 07:41:13.348832884 +0000 UTC m=+3395.600160018" Feb 23 07:41:19 crc kubenswrapper[5047]: I0223 07:41:19.341666 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:41:19 crc kubenswrapper[5047]: E0223 07:41:19.342758 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:41:19 crc kubenswrapper[5047]: I0223 
07:41:19.544658 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:19 crc kubenswrapper[5047]: I0223 07:41:19.544747 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:19 crc kubenswrapper[5047]: I0223 07:41:19.634319 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:20 crc kubenswrapper[5047]: I0223 07:41:20.440574 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:20 crc kubenswrapper[5047]: I0223 07:41:20.500357 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fjcj5"] Feb 23 07:41:22 crc kubenswrapper[5047]: I0223 07:41:22.394271 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fjcj5" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="registry-server" containerID="cri-o://570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d" gracePeriod=2 Feb 23 07:41:22 crc kubenswrapper[5047]: I0223 07:41:22.936461 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.031899 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-utilities\") pod \"89402f00-3610-493b-9052-9eb1b645d836\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.032101 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-catalog-content\") pod \"89402f00-3610-493b-9052-9eb1b645d836\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.032209 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krnzt\" (UniqueName: \"kubernetes.io/projected/89402f00-3610-493b-9052-9eb1b645d836-kube-api-access-krnzt\") pod \"89402f00-3610-493b-9052-9eb1b645d836\" (UID: \"89402f00-3610-493b-9052-9eb1b645d836\") " Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.035116 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-utilities" (OuterVolumeSpecName: "utilities") pod "89402f00-3610-493b-9052-9eb1b645d836" (UID: "89402f00-3610-493b-9052-9eb1b645d836"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.041415 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89402f00-3610-493b-9052-9eb1b645d836-kube-api-access-krnzt" (OuterVolumeSpecName: "kube-api-access-krnzt") pod "89402f00-3610-493b-9052-9eb1b645d836" (UID: "89402f00-3610-493b-9052-9eb1b645d836"). InnerVolumeSpecName "kube-api-access-krnzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.117022 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89402f00-3610-493b-9052-9eb1b645d836" (UID: "89402f00-3610-493b-9052-9eb1b645d836"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.134450 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.134503 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89402f00-3610-493b-9052-9eb1b645d836-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.134529 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krnzt\" (UniqueName: \"kubernetes.io/projected/89402f00-3610-493b-9052-9eb1b645d836-kube-api-access-krnzt\") on node \"crc\" DevicePath \"\"" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.409882 5047 generic.go:334] "Generic (PLEG): container finished" podID="89402f00-3610-493b-9052-9eb1b645d836" containerID="570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d" exitCode=0 Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.409978 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjcj5" event={"ID":"89402f00-3610-493b-9052-9eb1b645d836","Type":"ContainerDied","Data":"570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d"} Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.410610 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-fjcj5" event={"ID":"89402f00-3610-493b-9052-9eb1b645d836","Type":"ContainerDied","Data":"cd086b7f7c2834dfe6c301373ef28103237fed894b789212fe0dd3052a6a2971"} Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.410665 5047 scope.go:117] "RemoveContainer" containerID="570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.409998 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fjcj5" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.444599 5047 scope.go:117] "RemoveContainer" containerID="c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.465801 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fjcj5"] Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.476599 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fjcj5"] Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.493192 5047 scope.go:117] "RemoveContainer" containerID="0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.524369 5047 scope.go:117] "RemoveContainer" containerID="570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d" Feb 23 07:41:23 crc kubenswrapper[5047]: E0223 07:41:23.525122 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d\": container with ID starting with 570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d not found: ID does not exist" containerID="570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 
07:41:23.525193 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d"} err="failed to get container status \"570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d\": rpc error: code = NotFound desc = could not find container \"570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d\": container with ID starting with 570a5ee2102a5228496b88c5e24bb9cbbab05554dd5b52a684b8f6d66732bd0d not found: ID does not exist" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.525235 5047 scope.go:117] "RemoveContainer" containerID="c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0" Feb 23 07:41:23 crc kubenswrapper[5047]: E0223 07:41:23.526124 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0\": container with ID starting with c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0 not found: ID does not exist" containerID="c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.526178 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0"} err="failed to get container status \"c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0\": rpc error: code = NotFound desc = could not find container \"c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0\": container with ID starting with c1f55f77e842a2774ded092871e769e7c93227c8784d373756ec31eb75dec3a0 not found: ID does not exist" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.526214 5047 scope.go:117] "RemoveContainer" containerID="0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411" Feb 23 07:41:23 crc 
kubenswrapper[5047]: E0223 07:41:23.527020 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411\": container with ID starting with 0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411 not found: ID does not exist" containerID="0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411" Feb 23 07:41:23 crc kubenswrapper[5047]: I0223 07:41:23.527051 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411"} err="failed to get container status \"0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411\": rpc error: code = NotFound desc = could not find container \"0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411\": container with ID starting with 0061b6185671d2c408c45e548ef3b7b48b007dc5200c10b61cb29904789c2411 not found: ID does not exist" Feb 23 07:41:24 crc kubenswrapper[5047]: I0223 07:41:24.356116 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89402f00-3610-493b-9052-9eb1b645d836" path="/var/lib/kubelet/pods/89402f00-3610-493b-9052-9eb1b645d836/volumes" Feb 23 07:41:33 crc kubenswrapper[5047]: I0223 07:41:33.343057 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:41:33 crc kubenswrapper[5047]: E0223 07:41:33.344238 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:41:48 crc 
kubenswrapper[5047]: I0223 07:41:48.285276 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qtm7w"] Feb 23 07:41:48 crc kubenswrapper[5047]: E0223 07:41:48.288062 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="registry-server" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.288189 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="registry-server" Feb 23 07:41:48 crc kubenswrapper[5047]: E0223 07:41:48.288299 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="extract-utilities" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.288381 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="extract-utilities" Feb 23 07:41:48 crc kubenswrapper[5047]: E0223 07:41:48.288490 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="extract-content" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.288585 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="extract-content" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.288839 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="89402f00-3610-493b-9052-9eb1b645d836" containerName="registry-server" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.290337 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.325225 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qtm7w"] Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.342122 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:41:48 crc kubenswrapper[5047]: E0223 07:41:48.342601 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.462021 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-catalog-content\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.462096 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-utilities\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.462169 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvnx\" (UniqueName: 
\"kubernetes.io/projected/22cc304e-478e-437d-8573-ed9972785178-kube-api-access-xqvnx\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.563862 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-catalog-content\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.563928 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-utilities\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.563989 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvnx\" (UniqueName: \"kubernetes.io/projected/22cc304e-478e-437d-8573-ed9972785178-kube-api-access-xqvnx\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.564457 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-catalog-content\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.564700 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-utilities\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.592675 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvnx\" (UniqueName: \"kubernetes.io/projected/22cc304e-478e-437d-8573-ed9972785178-kube-api-access-xqvnx\") pod \"redhat-operators-qtm7w\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:48 crc kubenswrapper[5047]: I0223 07:41:48.661553 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:49 crc kubenswrapper[5047]: I0223 07:41:49.142413 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qtm7w"] Feb 23 07:41:49 crc kubenswrapper[5047]: I0223 07:41:49.649469 5047 generic.go:334] "Generic (PLEG): container finished" podID="22cc304e-478e-437d-8573-ed9972785178" containerID="37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a" exitCode=0 Feb 23 07:41:49 crc kubenswrapper[5047]: I0223 07:41:49.649575 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtm7w" event={"ID":"22cc304e-478e-437d-8573-ed9972785178","Type":"ContainerDied","Data":"37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a"} Feb 23 07:41:49 crc kubenswrapper[5047]: I0223 07:41:49.649838 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtm7w" event={"ID":"22cc304e-478e-437d-8573-ed9972785178","Type":"ContainerStarted","Data":"fc4563e2c130eeaac3cdd273fa11c584293fce9b3ea3a0bc8b1e73e4c7c083fb"} Feb 23 07:41:50 crc kubenswrapper[5047]: I0223 07:41:50.658632 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-qtm7w" event={"ID":"22cc304e-478e-437d-8573-ed9972785178","Type":"ContainerStarted","Data":"a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd"} Feb 23 07:41:52 crc kubenswrapper[5047]: I0223 07:41:52.677707 5047 generic.go:334] "Generic (PLEG): container finished" podID="22cc304e-478e-437d-8573-ed9972785178" containerID="a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd" exitCode=0 Feb 23 07:41:52 crc kubenswrapper[5047]: I0223 07:41:52.677799 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtm7w" event={"ID":"22cc304e-478e-437d-8573-ed9972785178","Type":"ContainerDied","Data":"a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd"} Feb 23 07:41:53 crc kubenswrapper[5047]: I0223 07:41:53.689171 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtm7w" event={"ID":"22cc304e-478e-437d-8573-ed9972785178","Type":"ContainerStarted","Data":"e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896"} Feb 23 07:41:53 crc kubenswrapper[5047]: I0223 07:41:53.723709 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qtm7w" podStartSLOduration=2.2670077920000002 podStartE2EDuration="5.723686614s" podCreationTimestamp="2026-02-23 07:41:48 +0000 UTC" firstStartedPulling="2026-02-23 07:41:49.650795782 +0000 UTC m=+3431.902122926" lastFinishedPulling="2026-02-23 07:41:53.107474564 +0000 UTC m=+3435.358801748" observedRunningTime="2026-02-23 07:41:53.720962442 +0000 UTC m=+3435.972289596" watchObservedRunningTime="2026-02-23 07:41:53.723686614 +0000 UTC m=+3435.975013748" Feb 23 07:41:58 crc kubenswrapper[5047]: I0223 07:41:58.662009 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:58 crc kubenswrapper[5047]: I0223 07:41:58.662107 5047 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:41:59 crc kubenswrapper[5047]: I0223 07:41:59.710462 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qtm7w" podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="registry-server" probeResult="failure" output=< Feb 23 07:41:59 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 07:41:59 crc kubenswrapper[5047]: > Feb 23 07:42:00 crc kubenswrapper[5047]: I0223 07:42:00.341124 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:42:00 crc kubenswrapper[5047]: E0223 07:42:00.341387 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:42:08 crc kubenswrapper[5047]: I0223 07:42:08.713714 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:42:08 crc kubenswrapper[5047]: I0223 07:42:08.768702 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:42:08 crc kubenswrapper[5047]: I0223 07:42:08.955171 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qtm7w"] Feb 23 07:42:09 crc kubenswrapper[5047]: I0223 07:42:09.814184 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qtm7w" podUID="22cc304e-478e-437d-8573-ed9972785178" 
containerName="registry-server" containerID="cri-o://e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896" gracePeriod=2 Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.233509 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.389080 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqvnx\" (UniqueName: \"kubernetes.io/projected/22cc304e-478e-437d-8573-ed9972785178-kube-api-access-xqvnx\") pod \"22cc304e-478e-437d-8573-ed9972785178\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.389588 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-catalog-content\") pod \"22cc304e-478e-437d-8573-ed9972785178\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.389711 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-utilities\") pod \"22cc304e-478e-437d-8573-ed9972785178\" (UID: \"22cc304e-478e-437d-8573-ed9972785178\") " Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.391160 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-utilities" (OuterVolumeSpecName: "utilities") pod "22cc304e-478e-437d-8573-ed9972785178" (UID: "22cc304e-478e-437d-8573-ed9972785178"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.397268 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22cc304e-478e-437d-8573-ed9972785178-kube-api-access-xqvnx" (OuterVolumeSpecName: "kube-api-access-xqvnx") pod "22cc304e-478e-437d-8573-ed9972785178" (UID: "22cc304e-478e-437d-8573-ed9972785178"). InnerVolumeSpecName "kube-api-access-xqvnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.491919 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqvnx\" (UniqueName: \"kubernetes.io/projected/22cc304e-478e-437d-8573-ed9972785178-kube-api-access-xqvnx\") on node \"crc\" DevicePath \"\"" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.491953 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.514697 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22cc304e-478e-437d-8573-ed9972785178" (UID: "22cc304e-478e-437d-8573-ed9972785178"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.592754 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22cc304e-478e-437d-8573-ed9972785178-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.841502 5047 generic.go:334] "Generic (PLEG): container finished" podID="22cc304e-478e-437d-8573-ed9972785178" containerID="e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896" exitCode=0 Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.841551 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtm7w" event={"ID":"22cc304e-478e-437d-8573-ed9972785178","Type":"ContainerDied","Data":"e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896"} Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.841581 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtm7w" event={"ID":"22cc304e-478e-437d-8573-ed9972785178","Type":"ContainerDied","Data":"fc4563e2c130eeaac3cdd273fa11c584293fce9b3ea3a0bc8b1e73e4c7c083fb"} Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.841598 5047 scope.go:117] "RemoveContainer" containerID="e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.843027 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qtm7w" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.863372 5047 scope.go:117] "RemoveContainer" containerID="a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.881823 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qtm7w"] Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.893778 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qtm7w"] Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.905486 5047 scope.go:117] "RemoveContainer" containerID="37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.928169 5047 scope.go:117] "RemoveContainer" containerID="e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896" Feb 23 07:42:10 crc kubenswrapper[5047]: E0223 07:42:10.929622 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896\": container with ID starting with e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896 not found: ID does not exist" containerID="e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.929680 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896"} err="failed to get container status \"e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896\": rpc error: code = NotFound desc = could not find container \"e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896\": container with ID starting with e53ef40de1ddb0ed8e1c025c1a40a5cca54ff9ec8961911c62e1aad7612f8896 not found: ID does 
not exist" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.929709 5047 scope.go:117] "RemoveContainer" containerID="a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd" Feb 23 07:42:10 crc kubenswrapper[5047]: E0223 07:42:10.930332 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd\": container with ID starting with a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd not found: ID does not exist" containerID="a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.930375 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd"} err="failed to get container status \"a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd\": rpc error: code = NotFound desc = could not find container \"a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd\": container with ID starting with a43d80b174efcf0692b8bf2a07d8139ff57c2d15f1ef7973ddbeb6d2ae5eb2bd not found: ID does not exist" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.930390 5047 scope.go:117] "RemoveContainer" containerID="37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a" Feb 23 07:42:10 crc kubenswrapper[5047]: E0223 07:42:10.930949 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a\": container with ID starting with 37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a not found: ID does not exist" containerID="37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a" Feb 23 07:42:10 crc kubenswrapper[5047]: I0223 07:42:10.930978 5047 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a"} err="failed to get container status \"37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a\": rpc error: code = NotFound desc = could not find container \"37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a\": container with ID starting with 37ba11be3adbbbb8de763afa0277a490d636b7b27e77c7ca6cd6ea7e90a7891a not found: ID does not exist" Feb 23 07:42:11 crc kubenswrapper[5047]: I0223 07:42:11.341601 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:42:11 crc kubenswrapper[5047]: E0223 07:42:11.342248 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:42:12 crc kubenswrapper[5047]: I0223 07:42:12.354094 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22cc304e-478e-437d-8573-ed9972785178" path="/var/lib/kubelet/pods/22cc304e-478e-437d-8573-ed9972785178/volumes" Feb 23 07:42:25 crc kubenswrapper[5047]: I0223 07:42:25.340987 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:42:25 crc kubenswrapper[5047]: E0223 07:42:25.341693 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:42:39 crc kubenswrapper[5047]: I0223 07:42:39.340874 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:42:39 crc kubenswrapper[5047]: E0223 07:42:39.342465 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:42:54 crc kubenswrapper[5047]: I0223 07:42:54.341664 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e" Feb 23 07:42:55 crc kubenswrapper[5047]: I0223 07:42:55.310508 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"359c89ecbc9f34e05cf22629cd30f4da456985c8cc28afe334784ecc30ab0dc0"} Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.784857 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tc58h"] Feb 23 07:43:23 crc kubenswrapper[5047]: E0223 07:43:23.785791 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="extract-content" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.785809 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="extract-content" Feb 23 07:43:23 crc kubenswrapper[5047]: E0223 07:43:23.785830 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="registry-server" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.785836 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="registry-server" Feb 23 07:43:23 crc kubenswrapper[5047]: E0223 07:43:23.785851 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="extract-utilities" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.785857 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="extract-utilities" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.786009 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="22cc304e-478e-437d-8573-ed9972785178" containerName="registry-server" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.787141 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.811381 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tc58h"] Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.876244 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfvv\" (UniqueName: \"kubernetes.io/projected/b9358f26-4094-48ad-a0a5-fcc6839014ec-kube-api-access-5qfvv\") pod \"community-operators-tc58h\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.876332 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-utilities\") pod \"community-operators-tc58h\" (UID: 
\"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.876399 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-catalog-content\") pod \"community-operators-tc58h\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.977631 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-catalog-content\") pod \"community-operators-tc58h\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.977798 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfvv\" (UniqueName: \"kubernetes.io/projected/b9358f26-4094-48ad-a0a5-fcc6839014ec-kube-api-access-5qfvv\") pod \"community-operators-tc58h\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.977858 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-utilities\") pod \"community-operators-tc58h\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.978445 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-catalog-content\") pod \"community-operators-tc58h\" (UID: 
\"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:23 crc kubenswrapper[5047]: I0223 07:43:23.978459 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-utilities\") pod \"community-operators-tc58h\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:24 crc kubenswrapper[5047]: I0223 07:43:24.001311 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qfvv\" (UniqueName: \"kubernetes.io/projected/b9358f26-4094-48ad-a0a5-fcc6839014ec-kube-api-access-5qfvv\") pod \"community-operators-tc58h\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:24 crc kubenswrapper[5047]: I0223 07:43:24.113991 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:24 crc kubenswrapper[5047]: I0223 07:43:24.440866 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tc58h"] Feb 23 07:43:24 crc kubenswrapper[5047]: I0223 07:43:24.615528 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerStarted","Data":"dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c"} Feb 23 07:43:24 crc kubenswrapper[5047]: I0223 07:43:24.615594 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerStarted","Data":"89bcc8944afcc49f76b23db13569f0817e60226531d7d019d809bdb6d4af7a0e"} Feb 23 07:43:25 crc kubenswrapper[5047]: I0223 07:43:25.626320 5047 generic.go:334] "Generic (PLEG): 
container finished" podID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerID="dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c" exitCode=0 Feb 23 07:43:25 crc kubenswrapper[5047]: I0223 07:43:25.626442 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerDied","Data":"dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c"} Feb 23 07:43:26 crc kubenswrapper[5047]: I0223 07:43:26.639521 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerStarted","Data":"f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873"} Feb 23 07:43:27 crc kubenswrapper[5047]: I0223 07:43:27.652875 5047 generic.go:334] "Generic (PLEG): container finished" podID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerID="f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873" exitCode=0 Feb 23 07:43:27 crc kubenswrapper[5047]: I0223 07:43:27.652979 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerDied","Data":"f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873"} Feb 23 07:43:28 crc kubenswrapper[5047]: I0223 07:43:28.667638 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerStarted","Data":"984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a"} Feb 23 07:43:28 crc kubenswrapper[5047]: I0223 07:43:28.694790 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tc58h" podStartSLOduration=3.245204292 podStartE2EDuration="5.694763283s" podCreationTimestamp="2026-02-23 07:43:23 
+0000 UTC" firstStartedPulling="2026-02-23 07:43:25.628954141 +0000 UTC m=+3527.880281275" lastFinishedPulling="2026-02-23 07:43:28.078513092 +0000 UTC m=+3530.329840266" observedRunningTime="2026-02-23 07:43:28.692264057 +0000 UTC m=+3530.943591221" watchObservedRunningTime="2026-02-23 07:43:28.694763283 +0000 UTC m=+3530.946090437" Feb 23 07:43:34 crc kubenswrapper[5047]: I0223 07:43:34.114724 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:34 crc kubenswrapper[5047]: I0223 07:43:34.115967 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:34 crc kubenswrapper[5047]: I0223 07:43:34.180180 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:34 crc kubenswrapper[5047]: I0223 07:43:34.770723 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:37 crc kubenswrapper[5047]: I0223 07:43:37.976780 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tc58h"] Feb 23 07:43:37 crc kubenswrapper[5047]: I0223 07:43:37.977178 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tc58h" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="registry-server" containerID="cri-o://984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a" gracePeriod=2 Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.464340 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.632522 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-catalog-content\") pod \"b9358f26-4094-48ad-a0a5-fcc6839014ec\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.633046 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qfvv\" (UniqueName: \"kubernetes.io/projected/b9358f26-4094-48ad-a0a5-fcc6839014ec-kube-api-access-5qfvv\") pod \"b9358f26-4094-48ad-a0a5-fcc6839014ec\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.633194 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-utilities\") pod \"b9358f26-4094-48ad-a0a5-fcc6839014ec\" (UID: \"b9358f26-4094-48ad-a0a5-fcc6839014ec\") " Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.635496 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-utilities" (OuterVolumeSpecName: "utilities") pod "b9358f26-4094-48ad-a0a5-fcc6839014ec" (UID: "b9358f26-4094-48ad-a0a5-fcc6839014ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.644226 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9358f26-4094-48ad-a0a5-fcc6839014ec-kube-api-access-5qfvv" (OuterVolumeSpecName: "kube-api-access-5qfvv") pod "b9358f26-4094-48ad-a0a5-fcc6839014ec" (UID: "b9358f26-4094-48ad-a0a5-fcc6839014ec"). InnerVolumeSpecName "kube-api-access-5qfvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.713330 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9358f26-4094-48ad-a0a5-fcc6839014ec" (UID: "b9358f26-4094-48ad-a0a5-fcc6839014ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.734777 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.734820 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9358f26-4094-48ad-a0a5-fcc6839014ec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.734848 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qfvv\" (UniqueName: \"kubernetes.io/projected/b9358f26-4094-48ad-a0a5-fcc6839014ec-kube-api-access-5qfvv\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.769322 5047 generic.go:334] "Generic (PLEG): container finished" podID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerID="984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a" exitCode=0 Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.769418 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tc58h" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.769404 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerDied","Data":"984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a"} Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.769549 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tc58h" event={"ID":"b9358f26-4094-48ad-a0a5-fcc6839014ec","Type":"ContainerDied","Data":"89bcc8944afcc49f76b23db13569f0817e60226531d7d019d809bdb6d4af7a0e"} Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.769594 5047 scope.go:117] "RemoveContainer" containerID="984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.801504 5047 scope.go:117] "RemoveContainer" containerID="f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.832831 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tc58h"] Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.833133 5047 scope.go:117] "RemoveContainer" containerID="dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.851063 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tc58h"] Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.872309 5047 scope.go:117] "RemoveContainer" containerID="984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a" Feb 23 07:43:38 crc kubenswrapper[5047]: E0223 07:43:38.873015 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a\": container with ID starting with 984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a not found: ID does not exist" containerID="984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.873071 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a"} err="failed to get container status \"984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a\": rpc error: code = NotFound desc = could not find container \"984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a\": container with ID starting with 984ae19ba2d65229d467339034a6e185253e3d48e561851dc152b028e2d1440a not found: ID does not exist" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.873100 5047 scope.go:117] "RemoveContainer" containerID="f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873" Feb 23 07:43:38 crc kubenswrapper[5047]: E0223 07:43:38.873513 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873\": container with ID starting with f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873 not found: ID does not exist" containerID="f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.873568 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873"} err="failed to get container status \"f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873\": rpc error: code = NotFound desc = could not find container \"f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873\": container with ID 
starting with f154b2a4cd9f127c6fd91bb5e5955601998e8601fbe7d5f87ee29273990ec873 not found: ID does not exist" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.873603 5047 scope.go:117] "RemoveContainer" containerID="dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c" Feb 23 07:43:38 crc kubenswrapper[5047]: E0223 07:43:38.874007 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c\": container with ID starting with dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c not found: ID does not exist" containerID="dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c" Feb 23 07:43:38 crc kubenswrapper[5047]: I0223 07:43:38.874068 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c"} err="failed to get container status \"dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c\": rpc error: code = NotFound desc = could not find container \"dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c\": container with ID starting with dd0263317948c66b419971bf140771608b4da7b1e41dc4aaaab74afbd3a1c01c not found: ID does not exist" Feb 23 07:43:40 crc kubenswrapper[5047]: I0223 07:43:40.352979 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" path="/var/lib/kubelet/pods/b9358f26-4094-48ad-a0a5-fcc6839014ec/volumes" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.002791 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zh7j8"] Feb 23 07:44:26 crc kubenswrapper[5047]: E0223 07:44:26.004284 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="extract-utilities" Feb 23 07:44:26 crc 
kubenswrapper[5047]: I0223 07:44:26.004324 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="extract-utilities" Feb 23 07:44:26 crc kubenswrapper[5047]: E0223 07:44:26.004358 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="registry-server" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.004367 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="registry-server" Feb 23 07:44:26 crc kubenswrapper[5047]: E0223 07:44:26.004401 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="extract-content" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.004409 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="extract-content" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.004613 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9358f26-4094-48ad-a0a5-fcc6839014ec" containerName="registry-server" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.006116 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.031040 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh7j8"] Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.100228 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-catalog-content\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.100311 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-utilities\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.100360 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxr4h\" (UniqueName: \"kubernetes.io/projected/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-kube-api-access-nxr4h\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.202516 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-catalog-content\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.202622 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-utilities\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.202666 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxr4h\" (UniqueName: \"kubernetes.io/projected/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-kube-api-access-nxr4h\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.203396 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-catalog-content\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.203417 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-utilities\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.233825 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxr4h\" (UniqueName: \"kubernetes.io/projected/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-kube-api-access-nxr4h\") pod \"redhat-marketplace-zh7j8\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") " pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.331730 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh7j8" Feb 23 07:44:26 crc kubenswrapper[5047]: I0223 07:44:26.808715 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh7j8"] Feb 23 07:44:27 crc kubenswrapper[5047]: I0223 07:44:27.242049 5047 generic.go:334] "Generic (PLEG): container finished" podID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerID="b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c" exitCode=0 Feb 23 07:44:27 crc kubenswrapper[5047]: I0223 07:44:27.242167 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh7j8" event={"ID":"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e","Type":"ContainerDied","Data":"b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c"} Feb 23 07:44:27 crc kubenswrapper[5047]: I0223 07:44:27.242851 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh7j8" event={"ID":"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e","Type":"ContainerStarted","Data":"746253fdac0cafa8ba0f51c66585baa68e4339d7510915e59c67e39dde7563ff"} Feb 23 07:44:29 crc kubenswrapper[5047]: I0223 07:44:29.263620 5047 generic.go:334] "Generic (PLEG): container finished" podID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerID="e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be" exitCode=0 Feb 23 07:44:29 crc kubenswrapper[5047]: I0223 07:44:29.263705 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh7j8" event={"ID":"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e","Type":"ContainerDied","Data":"e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be"} Feb 23 07:44:30 crc kubenswrapper[5047]: I0223 07:44:30.271671 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh7j8" 
event={"ID":"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e","Type":"ContainerStarted","Data":"9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc"}
Feb 23 07:44:30 crc kubenswrapper[5047]: I0223 07:44:30.294239 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zh7j8" podStartSLOduration=2.861503408 podStartE2EDuration="5.294210771s" podCreationTimestamp="2026-02-23 07:44:25 +0000 UTC" firstStartedPulling="2026-02-23 07:44:27.24461925 +0000 UTC m=+3589.495946394" lastFinishedPulling="2026-02-23 07:44:29.677326623 +0000 UTC m=+3591.928653757" observedRunningTime="2026-02-23 07:44:30.288056027 +0000 UTC m=+3592.539383161" watchObservedRunningTime="2026-02-23 07:44:30.294210771 +0000 UTC m=+3592.545537925"
Feb 23 07:44:36 crc kubenswrapper[5047]: I0223 07:44:36.332077 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zh7j8"
Feb 23 07:44:36 crc kubenswrapper[5047]: I0223 07:44:36.332752 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zh7j8"
Feb 23 07:44:36 crc kubenswrapper[5047]: I0223 07:44:36.412248 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zh7j8"
Feb 23 07:44:37 crc kubenswrapper[5047]: I0223 07:44:37.426459 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zh7j8"
Feb 23 07:44:37 crc kubenswrapper[5047]: I0223 07:44:37.502136 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh7j8"]
Feb 23 07:44:39 crc kubenswrapper[5047]: I0223 07:44:39.366458 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zh7j8" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="registry-server" containerID="cri-o://9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc" gracePeriod=2
Feb 23 07:44:39 crc kubenswrapper[5047]: I0223 07:44:39.878960 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh7j8"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.035777 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-utilities\") pod \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") "
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.035940 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-catalog-content\") pod \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") "
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.036099 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxr4h\" (UniqueName: \"kubernetes.io/projected/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-kube-api-access-nxr4h\") pod \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\" (UID: \"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e\") "
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.037032 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-utilities" (OuterVolumeSpecName: "utilities") pod "6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" (UID: "6192e7b3-b69d-44d1-88f3-ff6cbd9e667e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.043235 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-kube-api-access-nxr4h" (OuterVolumeSpecName: "kube-api-access-nxr4h") pod "6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" (UID: "6192e7b3-b69d-44d1-88f3-ff6cbd9e667e"). InnerVolumeSpecName "kube-api-access-nxr4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.063973 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" (UID: "6192e7b3-b69d-44d1-88f3-ff6cbd9e667e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.137473 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.137510 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxr4h\" (UniqueName: \"kubernetes.io/projected/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-kube-api-access-nxr4h\") on node \"crc\" DevicePath \"\""
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.137523 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.375069 5047 generic.go:334] "Generic (PLEG): container finished" podID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerID="9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc" exitCode=0
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.375126 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh7j8" event={"ID":"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e","Type":"ContainerDied","Data":"9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc"}
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.375168 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh7j8"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.375202 5047 scope.go:117] "RemoveContainer" containerID="9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.375181 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh7j8" event={"ID":"6192e7b3-b69d-44d1-88f3-ff6cbd9e667e","Type":"ContainerDied","Data":"746253fdac0cafa8ba0f51c66585baa68e4339d7510915e59c67e39dde7563ff"}
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.402479 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh7j8"]
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.404998 5047 scope.go:117] "RemoveContainer" containerID="e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.410635 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh7j8"]
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.434952 5047 scope.go:117] "RemoveContainer" containerID="b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.483022 5047 scope.go:117] "RemoveContainer" containerID="9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc"
Feb 23 07:44:40 crc kubenswrapper[5047]: E0223 07:44:40.483804 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc\": container with ID starting with 9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc not found: ID does not exist" containerID="9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.483847 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc"} err="failed to get container status \"9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc\": rpc error: code = NotFound desc = could not find container \"9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc\": container with ID starting with 9419bae3fd83f2b1c40aa03a3b7fa9678d72613a5c828f2083d2a5aa096f6ddc not found: ID does not exist"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.483876 5047 scope.go:117] "RemoveContainer" containerID="e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be"
Feb 23 07:44:40 crc kubenswrapper[5047]: E0223 07:44:40.484492 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be\": container with ID starting with e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be not found: ID does not exist" containerID="e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.484521 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be"} err="failed to get container status \"e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be\": rpc error: code = NotFound desc = could not find container \"e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be\": container with ID starting with e37a5cf92ad0d8eb9ab6d42fd2a918077ad484d01d605b558a3c596ad8e761be not found: ID does not exist"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.484538 5047 scope.go:117] "RemoveContainer" containerID="b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c"
Feb 23 07:44:40 crc kubenswrapper[5047]: E0223 07:44:40.484978 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c\": container with ID starting with b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c not found: ID does not exist" containerID="b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c"
Feb 23 07:44:40 crc kubenswrapper[5047]: I0223 07:44:40.485012 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c"} err="failed to get container status \"b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c\": rpc error: code = NotFound desc = could not find container \"b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c\": container with ID starting with b0bb181eb94df736ef695a15ca4a83567011d7b67c7f34605a6fc7d08afd144c not found: ID does not exist"
Feb 23 07:44:42 crc kubenswrapper[5047]: I0223 07:44:42.354658 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" path="/var/lib/kubelet/pods/6192e7b3-b69d-44d1-88f3-ff6cbd9e667e/volumes"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.160814 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"]
Feb 23 07:45:00 crc kubenswrapper[5047]: E0223 07:45:00.161782 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="registry-server"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.161799 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="registry-server"
Feb 23 07:45:00 crc kubenswrapper[5047]: E0223 07:45:00.161815 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="extract-utilities"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.161823 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="extract-utilities"
Feb 23 07:45:00 crc kubenswrapper[5047]: E0223 07:45:00.161855 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="extract-content"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.161865 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="extract-content"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.162041 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6192e7b3-b69d-44d1-88f3-ff6cbd9e667e" containerName="registry-server"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.162573 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.165270 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.165464 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.179488 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"]
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.250557 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fac5c32a-1ae4-472a-8158-73ecc3260f3d-secret-volume\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.250609 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8bv\" (UniqueName: \"kubernetes.io/projected/fac5c32a-1ae4-472a-8158-73ecc3260f3d-kube-api-access-sj8bv\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.250643 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fac5c32a-1ae4-472a-8158-73ecc3260f3d-config-volume\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.353177 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fac5c32a-1ae4-472a-8158-73ecc3260f3d-secret-volume\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.353430 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8bv\" (UniqueName: \"kubernetes.io/projected/fac5c32a-1ae4-472a-8158-73ecc3260f3d-kube-api-access-sj8bv\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.353470 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fac5c32a-1ae4-472a-8158-73ecc3260f3d-config-volume\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.354620 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fac5c32a-1ae4-472a-8158-73ecc3260f3d-config-volume\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.361290 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fac5c32a-1ae4-472a-8158-73ecc3260f3d-secret-volume\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.402981 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8bv\" (UniqueName: \"kubernetes.io/projected/fac5c32a-1ae4-472a-8158-73ecc3260f3d-kube-api-access-sj8bv\") pod \"collect-profiles-29530545-m4rqw\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:00 crc kubenswrapper[5047]: I0223 07:45:00.483186 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:01 crc kubenswrapper[5047]: I0223 07:45:01.009798 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"]
Feb 23 07:45:01 crc kubenswrapper[5047]: I0223 07:45:01.555084 5047 generic.go:334] "Generic (PLEG): container finished" podID="fac5c32a-1ae4-472a-8158-73ecc3260f3d" containerID="f56bb4bcb58240badcd123a44f7d98f5bf8f8bd27c80dbc9f57a3ddbcac1fc28" exitCode=0
Feb 23 07:45:01 crc kubenswrapper[5047]: I0223 07:45:01.555185 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw" event={"ID":"fac5c32a-1ae4-472a-8158-73ecc3260f3d","Type":"ContainerDied","Data":"f56bb4bcb58240badcd123a44f7d98f5bf8f8bd27c80dbc9f57a3ddbcac1fc28"}
Feb 23 07:45:01 crc kubenswrapper[5047]: I0223 07:45:01.555489 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw" event={"ID":"fac5c32a-1ae4-472a-8158-73ecc3260f3d","Type":"ContainerStarted","Data":"2f7d0b5fbe99f3d0562c4545eb2b51658b8e14137fef810e6006e36272fe2d5e"}
Feb 23 07:45:02 crc kubenswrapper[5047]: I0223 07:45:02.884716 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:02 crc kubenswrapper[5047]: I0223 07:45:02.995189 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fac5c32a-1ae4-472a-8158-73ecc3260f3d-secret-volume\") pod \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") "
Feb 23 07:45:02 crc kubenswrapper[5047]: I0223 07:45:02.995525 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fac5c32a-1ae4-472a-8158-73ecc3260f3d-config-volume\") pod \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") "
Feb 23 07:45:02 crc kubenswrapper[5047]: I0223 07:45:02.995653 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj8bv\" (UniqueName: \"kubernetes.io/projected/fac5c32a-1ae4-472a-8158-73ecc3260f3d-kube-api-access-sj8bv\") pod \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\" (UID: \"fac5c32a-1ae4-472a-8158-73ecc3260f3d\") "
Feb 23 07:45:02 crc kubenswrapper[5047]: I0223 07:45:02.997042 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac5c32a-1ae4-472a-8158-73ecc3260f3d-config-volume" (OuterVolumeSpecName: "config-volume") pod "fac5c32a-1ae4-472a-8158-73ecc3260f3d" (UID: "fac5c32a-1ae4-472a-8158-73ecc3260f3d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.002711 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac5c32a-1ae4-472a-8158-73ecc3260f3d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fac5c32a-1ae4-472a-8158-73ecc3260f3d" (UID: "fac5c32a-1ae4-472a-8158-73ecc3260f3d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.004301 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac5c32a-1ae4-472a-8158-73ecc3260f3d-kube-api-access-sj8bv" (OuterVolumeSpecName: "kube-api-access-sj8bv") pod "fac5c32a-1ae4-472a-8158-73ecc3260f3d" (UID: "fac5c32a-1ae4-472a-8158-73ecc3260f3d"). InnerVolumeSpecName "kube-api-access-sj8bv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.097822 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fac5c32a-1ae4-472a-8158-73ecc3260f3d-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.097879 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj8bv\" (UniqueName: \"kubernetes.io/projected/fac5c32a-1ae4-472a-8158-73ecc3260f3d-kube-api-access-sj8bv\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.097893 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fac5c32a-1ae4-472a-8158-73ecc3260f3d-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.595315 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw" event={"ID":"fac5c32a-1ae4-472a-8158-73ecc3260f3d","Type":"ContainerDied","Data":"2f7d0b5fbe99f3d0562c4545eb2b51658b8e14137fef810e6006e36272fe2d5e"}
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.595374 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f7d0b5fbe99f3d0562c4545eb2b51658b8e14137fef810e6006e36272fe2d5e"
Feb 23 07:45:03 crc kubenswrapper[5047]: I0223 07:45:03.595393 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"
Feb 23 07:45:04 crc kubenswrapper[5047]: I0223 07:45:04.043996 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9"]
Feb 23 07:45:04 crc kubenswrapper[5047]: I0223 07:45:04.058105 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-hchc9"]
Feb 23 07:45:04 crc kubenswrapper[5047]: I0223 07:45:04.348710 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5340c59b-28d2-4263-9d55-44587a81ea28" path="/var/lib/kubelet/pods/5340c59b-28d2-4263-9d55-44587a81ea28/volumes"
Feb 23 07:45:16 crc kubenswrapper[5047]: I0223 07:45:16.760549 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:45:16 crc kubenswrapper[5047]: I0223 07:45:16.764221 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:45:46 crc kubenswrapper[5047]: I0223 07:45:46.759651 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:45:46 crc kubenswrapper[5047]: I0223 07:45:46.760356 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:45:54 crc kubenswrapper[5047]: I0223 07:45:54.003277 5047 scope.go:117] "RemoveContainer" containerID="1891e4a1c2ab2142005656591d4d959173de2074304278f15288f5e43fb22087"
Feb 23 07:46:16 crc kubenswrapper[5047]: I0223 07:46:16.760302 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:46:16 crc kubenswrapper[5047]: I0223 07:46:16.761099 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:46:16 crc kubenswrapper[5047]: I0223 07:46:16.761173 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv"
Feb 23 07:46:16 crc kubenswrapper[5047]: I0223 07:46:16.762148 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"359c89ecbc9f34e05cf22629cd30f4da456985c8cc28afe334784ecc30ab0dc0"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 07:46:16 crc kubenswrapper[5047]: I0223 07:46:16.762297 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://359c89ecbc9f34e05cf22629cd30f4da456985c8cc28afe334784ecc30ab0dc0" gracePeriod=600
Feb 23 07:46:17 crc kubenswrapper[5047]: I0223 07:46:17.326737 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="359c89ecbc9f34e05cf22629cd30f4da456985c8cc28afe334784ecc30ab0dc0" exitCode=0
Feb 23 07:46:17 crc kubenswrapper[5047]: I0223 07:46:17.326844 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"359c89ecbc9f34e05cf22629cd30f4da456985c8cc28afe334784ecc30ab0dc0"}
Feb 23 07:46:17 crc kubenswrapper[5047]: I0223 07:46:17.329021 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"}
Feb 23 07:46:17 crc kubenswrapper[5047]: I0223 07:46:17.329064 5047 scope.go:117] "RemoveContainer" containerID="d1c0494a328ede6b488d216910417f865ad23b17adcf8b69eba817b407a6ea6e"
Feb 23 07:48:46 crc kubenswrapper[5047]: I0223 07:48:46.759773 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:48:46 crc kubenswrapper[5047]: I0223 07:48:46.760720 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:49:16 crc kubenswrapper[5047]: I0223 07:49:16.759604 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:49:16 crc kubenswrapper[5047]: I0223 07:49:16.760503 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:49:46 crc kubenswrapper[5047]: I0223 07:49:46.759607 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:49:46 crc kubenswrapper[5047]: I0223 07:49:46.760578 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:49:46 crc kubenswrapper[5047]: I0223 07:49:46.760663 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv"
Feb 23 07:49:46 crc kubenswrapper[5047]: I0223 07:49:46.761625 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 07:49:46 crc kubenswrapper[5047]: I0223 07:49:46.761732 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" gracePeriod=600
Feb 23 07:49:46 crc kubenswrapper[5047]: E0223 07:49:46.895775 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:49:47 crc kubenswrapper[5047]: I0223 07:49:47.388132 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" exitCode=0
Feb 23 07:49:47 crc kubenswrapper[5047]: I0223 07:49:47.388213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"}
Feb 23 07:49:47 crc kubenswrapper[5047]: I0223 07:49:47.388302 5047 scope.go:117] "RemoveContainer" containerID="359c89ecbc9f34e05cf22629cd30f4da456985c8cc28afe334784ecc30ab0dc0"
Feb 23 07:49:47 crc kubenswrapper[5047]: I0223 07:49:47.389066 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:49:47 crc kubenswrapper[5047]: E0223 07:49:47.389419 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:50:01 crc kubenswrapper[5047]: I0223 07:50:01.341395 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:50:01 crc kubenswrapper[5047]: E0223 07:50:01.343874 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:50:15 crc kubenswrapper[5047]: I0223 07:50:15.342041 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:50:15 crc kubenswrapper[5047]: E0223 07:50:15.343278 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:50:27 crc kubenswrapper[5047]: I0223 07:50:27.341685 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:50:27 crc kubenswrapper[5047]: E0223 07:50:27.342860 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:50:38 crc kubenswrapper[5047]: I0223 07:50:38.346508 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:50:38 crc kubenswrapper[5047]: E0223 07:50:38.349002 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:50:50 crc kubenswrapper[5047]: I0223 07:50:50.340887 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:50:50 crc kubenswrapper[5047]: E0223 07:50:50.342225 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:51:03 crc kubenswrapper[5047]: I0223 07:51:03.341395 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:51:03 crc kubenswrapper[5047]: E0223 07:51:03.342327 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:51:16 crc kubenswrapper[5047]: I0223 07:51:16.341687 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:51:16 crc kubenswrapper[5047]: E0223 07:51:16.342779 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:51:27 crc kubenswrapper[5047]: I0223 07:51:27.341414 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:51:27 crc kubenswrapper[5047]: E0223 07:51:27.342709 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:51:39 crc kubenswrapper[5047]: I0223 07:51:39.342018 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:51:39 crc kubenswrapper[5047]: E0223 07:51:39.343156 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:51:52 crc kubenswrapper[5047]: I0223 07:51:52.342039 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb 23 07:51:52 crc kubenswrapper[5047]: E0223 07:51:52.343377 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 07:52:03 crc kubenswrapper[5047]: I0223 07:52:03.341449 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863"
Feb
23 07:52:03 crc kubenswrapper[5047]: E0223 07:52:03.342173 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:52:15 crc kubenswrapper[5047]: I0223 07:52:15.341240 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:52:15 crc kubenswrapper[5047]: E0223 07:52:15.342011 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.354772 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k6z24"] Feb 23 07:52:27 crc kubenswrapper[5047]: E0223 07:52:27.355660 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac5c32a-1ae4-472a-8158-73ecc3260f3d" containerName="collect-profiles" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.355672 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac5c32a-1ae4-472a-8158-73ecc3260f3d" containerName="collect-profiles" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.355836 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac5c32a-1ae4-472a-8158-73ecc3260f3d" containerName="collect-profiles" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.356937 5047 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.406710 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g75p\" (UniqueName: \"kubernetes.io/projected/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-kube-api-access-7g75p\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.406775 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-catalog-content\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.407169 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-utilities\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.439969 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6z24"] Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.509224 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g75p\" (UniqueName: \"kubernetes.io/projected/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-kube-api-access-7g75p\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.509307 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-catalog-content\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.509379 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-utilities\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.509894 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-utilities\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.510154 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-catalog-content\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.530485 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g75p\" (UniqueName: \"kubernetes.io/projected/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-kube-api-access-7g75p\") pod \"redhat-operators-k6z24\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:27 crc kubenswrapper[5047]: I0223 07:52:27.673771 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:28 crc kubenswrapper[5047]: I0223 07:52:28.131539 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6z24"] Feb 23 07:52:28 crc kubenswrapper[5047]: I0223 07:52:28.348293 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:52:28 crc kubenswrapper[5047]: E0223 07:52:28.348960 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:52:29 crc kubenswrapper[5047]: I0223 07:52:29.038644 5047 generic.go:334] "Generic (PLEG): container finished" podID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerID="313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4" exitCode=0 Feb 23 07:52:29 crc kubenswrapper[5047]: I0223 07:52:29.038694 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6z24" event={"ID":"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f","Type":"ContainerDied","Data":"313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4"} Feb 23 07:52:29 crc kubenswrapper[5047]: I0223 07:52:29.038726 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6z24" event={"ID":"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f","Type":"ContainerStarted","Data":"7088dd59483baeebb923855822466250539248c05c3791dc50a3fece1465e5cf"} Feb 23 07:52:29 crc kubenswrapper[5047]: I0223 07:52:29.041240 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:52:30 crc 
kubenswrapper[5047]: I0223 07:52:30.051351 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6z24" event={"ID":"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f","Type":"ContainerStarted","Data":"98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472"} Feb 23 07:52:31 crc kubenswrapper[5047]: I0223 07:52:31.063209 5047 generic.go:334] "Generic (PLEG): container finished" podID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerID="98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472" exitCode=0 Feb 23 07:52:31 crc kubenswrapper[5047]: I0223 07:52:31.063357 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6z24" event={"ID":"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f","Type":"ContainerDied","Data":"98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472"} Feb 23 07:52:32 crc kubenswrapper[5047]: I0223 07:52:32.073174 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6z24" event={"ID":"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f","Type":"ContainerStarted","Data":"3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd"} Feb 23 07:52:32 crc kubenswrapper[5047]: I0223 07:52:32.103415 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k6z24" podStartSLOduration=2.462980177 podStartE2EDuration="5.103398449s" podCreationTimestamp="2026-02-23 07:52:27 +0000 UTC" firstStartedPulling="2026-02-23 07:52:29.040997502 +0000 UTC m=+4071.292324636" lastFinishedPulling="2026-02-23 07:52:31.681415754 +0000 UTC m=+4073.932742908" observedRunningTime="2026-02-23 07:52:32.097206792 +0000 UTC m=+4074.348533936" watchObservedRunningTime="2026-02-23 07:52:32.103398449 +0000 UTC m=+4074.354725583" Feb 23 07:52:37 crc kubenswrapper[5047]: I0223 07:52:37.674692 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:37 crc kubenswrapper[5047]: I0223 07:52:37.675406 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:38 crc kubenswrapper[5047]: I0223 07:52:38.725269 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6z24" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="registry-server" probeResult="failure" output=< Feb 23 07:52:38 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 07:52:38 crc kubenswrapper[5047]: > Feb 23 07:52:41 crc kubenswrapper[5047]: I0223 07:52:41.340989 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:52:41 crc kubenswrapper[5047]: E0223 07:52:41.341385 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:52:47 crc kubenswrapper[5047]: I0223 07:52:47.752434 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:47 crc kubenswrapper[5047]: I0223 07:52:47.821852 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:48 crc kubenswrapper[5047]: I0223 07:52:48.011671 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6z24"] Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.217034 5047 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-k6z24" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="registry-server" containerID="cri-o://3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd" gracePeriod=2 Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.681877 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.715275 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-utilities\") pod \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.715432 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-catalog-content\") pod \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.715516 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g75p\" (UniqueName: \"kubernetes.io/projected/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-kube-api-access-7g75p\") pod \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\" (UID: \"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f\") " Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.716423 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-utilities" (OuterVolumeSpecName: "utilities") pod "66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" (UID: "66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.723279 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-kube-api-access-7g75p" (OuterVolumeSpecName: "kube-api-access-7g75p") pod "66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" (UID: "66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f"). InnerVolumeSpecName "kube-api-access-7g75p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.817772 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.817837 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g75p\" (UniqueName: \"kubernetes.io/projected/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-kube-api-access-7g75p\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.835244 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" (UID: "66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:52:49 crc kubenswrapper[5047]: I0223 07:52:49.919046 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.225720 5047 generic.go:334] "Generic (PLEG): container finished" podID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerID="3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd" exitCode=0 Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.225768 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6z24" event={"ID":"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f","Type":"ContainerDied","Data":"3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd"} Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.225803 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6z24" event={"ID":"66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f","Type":"ContainerDied","Data":"7088dd59483baeebb923855822466250539248c05c3791dc50a3fece1465e5cf"} Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.225803 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6z24" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.225822 5047 scope.go:117] "RemoveContainer" containerID="3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.247066 5047 scope.go:117] "RemoveContainer" containerID="98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.259659 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6z24"] Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.263837 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k6z24"] Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.272463 5047 scope.go:117] "RemoveContainer" containerID="313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.297057 5047 scope.go:117] "RemoveContainer" containerID="3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd" Feb 23 07:52:50 crc kubenswrapper[5047]: E0223 07:52:50.297449 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd\": container with ID starting with 3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd not found: ID does not exist" containerID="3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.297479 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd"} err="failed to get container status \"3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd\": rpc error: code = NotFound desc = could not find container 
\"3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd\": container with ID starting with 3f2986fc48b5bb53c9f41ed1c1cd293dab20896d5b456d12e1a492474f54c2bd not found: ID does not exist" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.297501 5047 scope.go:117] "RemoveContainer" containerID="98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472" Feb 23 07:52:50 crc kubenswrapper[5047]: E0223 07:52:50.297716 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472\": container with ID starting with 98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472 not found: ID does not exist" containerID="98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.297736 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472"} err="failed to get container status \"98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472\": rpc error: code = NotFound desc = could not find container \"98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472\": container with ID starting with 98d9f5f3d2af5c8ed5ab998523dd3f6ab1b68db8005302af6822cb6ea2597472 not found: ID does not exist" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.297747 5047 scope.go:117] "RemoveContainer" containerID="313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4" Feb 23 07:52:50 crc kubenswrapper[5047]: E0223 07:52:50.298046 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4\": container with ID starting with 313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4 not found: ID does not exist" 
containerID="313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.298070 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4"} err="failed to get container status \"313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4\": rpc error: code = NotFound desc = could not find container \"313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4\": container with ID starting with 313a1a5ad7cd61f989e77565237fffd22ff6b8030065acb5358ec564a12d8ed4 not found: ID does not exist" Feb 23 07:52:50 crc kubenswrapper[5047]: I0223 07:52:50.347337 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" path="/var/lib/kubelet/pods/66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f/volumes" Feb 23 07:52:55 crc kubenswrapper[5047]: I0223 07:52:55.341598 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:52:55 crc kubenswrapper[5047]: E0223 07:52:55.342512 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:53:09 crc kubenswrapper[5047]: I0223 07:53:09.341188 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:53:09 crc kubenswrapper[5047]: E0223 07:53:09.342422 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:53:24 crc kubenswrapper[5047]: I0223 07:53:24.340945 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:53:24 crc kubenswrapper[5047]: E0223 07:53:24.342201 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.806866 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dxkhr"] Feb 23 07:53:25 crc kubenswrapper[5047]: E0223 07:53:25.808222 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="extract-content" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.808318 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="extract-content" Feb 23 07:53:25 crc kubenswrapper[5047]: E0223 07:53:25.808390 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="extract-utilities" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.808448 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="extract-utilities" Feb 23 07:53:25 crc kubenswrapper[5047]: E0223 07:53:25.808508 5047 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="registry-server" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.808565 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="registry-server" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.808836 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f5fb76-ce5b-4d13-8036-51ae0ffc4f8f" containerName="registry-server" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.810639 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.844640 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxkhr"] Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.921826 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjsk\" (UniqueName: \"kubernetes.io/projected/426b37ab-98cd-4088-91a7-81854c832db6-kube-api-access-gbjsk\") pod \"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.922009 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-utilities\") pod \"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:25 crc kubenswrapper[5047]: I0223 07:53:25.922088 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-catalog-content\") pod 
\"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.023507 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-catalog-content\") pod \"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.023583 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjsk\" (UniqueName: \"kubernetes.io/projected/426b37ab-98cd-4088-91a7-81854c832db6-kube-api-access-gbjsk\") pod \"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.023680 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-utilities\") pod \"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.024392 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-catalog-content\") pod \"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.024436 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-utilities\") pod \"certified-operators-dxkhr\" (UID: 
\"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.051341 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjsk\" (UniqueName: \"kubernetes.io/projected/426b37ab-98cd-4088-91a7-81854c832db6-kube-api-access-gbjsk\") pod \"certified-operators-dxkhr\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.141118 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.424290 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxkhr"] Feb 23 07:53:26 crc kubenswrapper[5047]: I0223 07:53:26.556537 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxkhr" event={"ID":"426b37ab-98cd-4088-91a7-81854c832db6","Type":"ContainerStarted","Data":"93cd2d6f34b37495bb9d7f6e4779f8a769b1e798fd6a07ca61a698c915158cc5"} Feb 23 07:53:27 crc kubenswrapper[5047]: I0223 07:53:27.568755 5047 generic.go:334] "Generic (PLEG): container finished" podID="426b37ab-98cd-4088-91a7-81854c832db6" containerID="ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee" exitCode=0 Feb 23 07:53:27 crc kubenswrapper[5047]: I0223 07:53:27.568868 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxkhr" event={"ID":"426b37ab-98cd-4088-91a7-81854c832db6","Type":"ContainerDied","Data":"ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee"} Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.415554 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mv2wk"] Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.418602 
5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.437283 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mv2wk"] Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.477986 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-utilities\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.478708 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-catalog-content\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.478859 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshv4\" (UniqueName: \"kubernetes.io/projected/465dec46-379b-4071-bea8-06eacd0286b5-kube-api-access-wshv4\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.580716 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-utilities\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.580768 
5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-catalog-content\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.580804 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshv4\" (UniqueName: \"kubernetes.io/projected/465dec46-379b-4071-bea8-06eacd0286b5-kube-api-access-wshv4\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.580949 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxkhr" event={"ID":"426b37ab-98cd-4088-91a7-81854c832db6","Type":"ContainerStarted","Data":"be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45"} Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.581875 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-catalog-content\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.582667 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-utilities\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.620273 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshv4\" 
(UniqueName: \"kubernetes.io/projected/465dec46-379b-4071-bea8-06eacd0286b5-kube-api-access-wshv4\") pod \"community-operators-mv2wk\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:28 crc kubenswrapper[5047]: I0223 07:53:28.778893 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:29 crc kubenswrapper[5047]: I0223 07:53:29.265641 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mv2wk"] Feb 23 07:53:29 crc kubenswrapper[5047]: I0223 07:53:29.595239 5047 generic.go:334] "Generic (PLEG): container finished" podID="465dec46-379b-4071-bea8-06eacd0286b5" containerID="b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88" exitCode=0 Feb 23 07:53:29 crc kubenswrapper[5047]: I0223 07:53:29.595314 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv2wk" event={"ID":"465dec46-379b-4071-bea8-06eacd0286b5","Type":"ContainerDied","Data":"b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88"} Feb 23 07:53:29 crc kubenswrapper[5047]: I0223 07:53:29.597000 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv2wk" event={"ID":"465dec46-379b-4071-bea8-06eacd0286b5","Type":"ContainerStarted","Data":"a1798676396e2c303e0198a2c1651b692f3886dbe03c24d307399bac0f052042"} Feb 23 07:53:29 crc kubenswrapper[5047]: I0223 07:53:29.604248 5047 generic.go:334] "Generic (PLEG): container finished" podID="426b37ab-98cd-4088-91a7-81854c832db6" containerID="be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45" exitCode=0 Feb 23 07:53:29 crc kubenswrapper[5047]: I0223 07:53:29.604284 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxkhr" 
event={"ID":"426b37ab-98cd-4088-91a7-81854c832db6","Type":"ContainerDied","Data":"be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45"} Feb 23 07:53:30 crc kubenswrapper[5047]: I0223 07:53:30.612457 5047 generic.go:334] "Generic (PLEG): container finished" podID="465dec46-379b-4071-bea8-06eacd0286b5" containerID="a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0" exitCode=0 Feb 23 07:53:30 crc kubenswrapper[5047]: I0223 07:53:30.612572 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv2wk" event={"ID":"465dec46-379b-4071-bea8-06eacd0286b5","Type":"ContainerDied","Data":"a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0"} Feb 23 07:53:30 crc kubenswrapper[5047]: I0223 07:53:30.615567 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxkhr" event={"ID":"426b37ab-98cd-4088-91a7-81854c832db6","Type":"ContainerStarted","Data":"fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9"} Feb 23 07:53:30 crc kubenswrapper[5047]: I0223 07:53:30.670739 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dxkhr" podStartSLOduration=3.173286256 podStartE2EDuration="5.670714076s" podCreationTimestamp="2026-02-23 07:53:25 +0000 UTC" firstStartedPulling="2026-02-23 07:53:27.571783572 +0000 UTC m=+4129.823110716" lastFinishedPulling="2026-02-23 07:53:30.069211362 +0000 UTC m=+4132.320538536" observedRunningTime="2026-02-23 07:53:30.662375161 +0000 UTC m=+4132.913702295" watchObservedRunningTime="2026-02-23 07:53:30.670714076 +0000 UTC m=+4132.922041230" Feb 23 07:53:31 crc kubenswrapper[5047]: I0223 07:53:31.627267 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv2wk" 
event={"ID":"465dec46-379b-4071-bea8-06eacd0286b5","Type":"ContainerStarted","Data":"b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386"} Feb 23 07:53:31 crc kubenswrapper[5047]: I0223 07:53:31.657192 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mv2wk" podStartSLOduration=2.185368342 podStartE2EDuration="3.657171295s" podCreationTimestamp="2026-02-23 07:53:28 +0000 UTC" firstStartedPulling="2026-02-23 07:53:29.598462062 +0000 UTC m=+4131.849789196" lastFinishedPulling="2026-02-23 07:53:31.070265005 +0000 UTC m=+4133.321592149" observedRunningTime="2026-02-23 07:53:31.653377933 +0000 UTC m=+4133.904705077" watchObservedRunningTime="2026-02-23 07:53:31.657171295 +0000 UTC m=+4133.908498439" Feb 23 07:53:35 crc kubenswrapper[5047]: I0223 07:53:35.340864 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:53:35 crc kubenswrapper[5047]: E0223 07:53:35.341605 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:53:36 crc kubenswrapper[5047]: I0223 07:53:36.141886 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:36 crc kubenswrapper[5047]: I0223 07:53:36.141981 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:36 crc kubenswrapper[5047]: I0223 07:53:36.193342 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:36 crc kubenswrapper[5047]: I0223 07:53:36.736688 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:36 crc kubenswrapper[5047]: I0223 07:53:36.817886 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxkhr"] Feb 23 07:53:38 crc kubenswrapper[5047]: I0223 07:53:38.685017 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dxkhr" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="registry-server" containerID="cri-o://fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9" gracePeriod=2 Feb 23 07:53:38 crc kubenswrapper[5047]: I0223 07:53:38.779325 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:38 crc kubenswrapper[5047]: I0223 07:53:38.779390 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:38 crc kubenswrapper[5047]: I0223 07:53:38.832895 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.138749 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.304697 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjsk\" (UniqueName: \"kubernetes.io/projected/426b37ab-98cd-4088-91a7-81854c832db6-kube-api-access-gbjsk\") pod \"426b37ab-98cd-4088-91a7-81854c832db6\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.304853 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-utilities\") pod \"426b37ab-98cd-4088-91a7-81854c832db6\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.305024 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-catalog-content\") pod \"426b37ab-98cd-4088-91a7-81854c832db6\" (UID: \"426b37ab-98cd-4088-91a7-81854c832db6\") " Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.305897 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-utilities" (OuterVolumeSpecName: "utilities") pod "426b37ab-98cd-4088-91a7-81854c832db6" (UID: "426b37ab-98cd-4088-91a7-81854c832db6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.311161 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426b37ab-98cd-4088-91a7-81854c832db6-kube-api-access-gbjsk" (OuterVolumeSpecName: "kube-api-access-gbjsk") pod "426b37ab-98cd-4088-91a7-81854c832db6" (UID: "426b37ab-98cd-4088-91a7-81854c832db6"). InnerVolumeSpecName "kube-api-access-gbjsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.363777 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "426b37ab-98cd-4088-91a7-81854c832db6" (UID: "426b37ab-98cd-4088-91a7-81854c832db6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.406863 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjsk\" (UniqueName: \"kubernetes.io/projected/426b37ab-98cd-4088-91a7-81854c832db6-kube-api-access-gbjsk\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.406930 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.406946 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426b37ab-98cd-4088-91a7-81854c832db6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.699418 5047 generic.go:334] "Generic (PLEG): container finished" podID="426b37ab-98cd-4088-91a7-81854c832db6" containerID="fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9" exitCode=0 Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.699496 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxkhr" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.699548 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxkhr" event={"ID":"426b37ab-98cd-4088-91a7-81854c832db6","Type":"ContainerDied","Data":"fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9"} Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.699620 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxkhr" event={"ID":"426b37ab-98cd-4088-91a7-81854c832db6","Type":"ContainerDied","Data":"93cd2d6f34b37495bb9d7f6e4779f8a769b1e798fd6a07ca61a698c915158cc5"} Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.699657 5047 scope.go:117] "RemoveContainer" containerID="fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.739501 5047 scope.go:117] "RemoveContainer" containerID="be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.749595 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxkhr"] Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.767288 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dxkhr"] Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.773242 5047 scope.go:117] "RemoveContainer" containerID="ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.776198 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.819634 5047 scope.go:117] "RemoveContainer" containerID="fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9" Feb 23 07:53:39 crc 
kubenswrapper[5047]: E0223 07:53:39.820424 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9\": container with ID starting with fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9 not found: ID does not exist" containerID="fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.820497 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9"} err="failed to get container status \"fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9\": rpc error: code = NotFound desc = could not find container \"fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9\": container with ID starting with fdf4ce4674ef318ad936a1867764fdb7d78c0dc1cf10a5d243327180034d30b9 not found: ID does not exist" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.820553 5047 scope.go:117] "RemoveContainer" containerID="be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45" Feb 23 07:53:39 crc kubenswrapper[5047]: E0223 07:53:39.821124 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45\": container with ID starting with be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45 not found: ID does not exist" containerID="be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.821185 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45"} err="failed to get container status 
\"be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45\": rpc error: code = NotFound desc = could not find container \"be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45\": container with ID starting with be9b092b79b1a2912812cd6337b94e42d1c7433ddd9492e3f076735678533f45 not found: ID does not exist" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.821229 5047 scope.go:117] "RemoveContainer" containerID="ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee" Feb 23 07:53:39 crc kubenswrapper[5047]: E0223 07:53:39.822247 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee\": container with ID starting with ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee not found: ID does not exist" containerID="ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee" Feb 23 07:53:39 crc kubenswrapper[5047]: I0223 07:53:39.822282 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee"} err="failed to get container status \"ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee\": rpc error: code = NotFound desc = could not find container \"ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee\": container with ID starting with ba5a9d294ace5ea0bba6ab114043ba64670d8b6b8a8e79a02843cb573bfe34ee not found: ID does not exist" Feb 23 07:53:40 crc kubenswrapper[5047]: I0223 07:53:40.348685 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426b37ab-98cd-4088-91a7-81854c832db6" path="/var/lib/kubelet/pods/426b37ab-98cd-4088-91a7-81854c832db6/volumes" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.036120 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mv2wk"] Feb 23 
07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.038527 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mv2wk" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="registry-server" containerID="cri-o://b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386" gracePeriod=2 Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.499607 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.661806 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-catalog-content\") pod \"465dec46-379b-4071-bea8-06eacd0286b5\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.661893 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshv4\" (UniqueName: \"kubernetes.io/projected/465dec46-379b-4071-bea8-06eacd0286b5-kube-api-access-wshv4\") pod \"465dec46-379b-4071-bea8-06eacd0286b5\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.661977 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-utilities\") pod \"465dec46-379b-4071-bea8-06eacd0286b5\" (UID: \"465dec46-379b-4071-bea8-06eacd0286b5\") " Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.662963 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-utilities" (OuterVolumeSpecName: "utilities") pod "465dec46-379b-4071-bea8-06eacd0286b5" (UID: "465dec46-379b-4071-bea8-06eacd0286b5"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.670579 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465dec46-379b-4071-bea8-06eacd0286b5-kube-api-access-wshv4" (OuterVolumeSpecName: "kube-api-access-wshv4") pod "465dec46-379b-4071-bea8-06eacd0286b5" (UID: "465dec46-379b-4071-bea8-06eacd0286b5"). InnerVolumeSpecName "kube-api-access-wshv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.720987 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "465dec46-379b-4071-bea8-06eacd0286b5" (UID: "465dec46-379b-4071-bea8-06eacd0286b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.736634 5047 generic.go:334] "Generic (PLEG): container finished" podID="465dec46-379b-4071-bea8-06eacd0286b5" containerID="b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386" exitCode=0 Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.736720 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv2wk" event={"ID":"465dec46-379b-4071-bea8-06eacd0286b5","Type":"ContainerDied","Data":"b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386"} Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.736790 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mv2wk" event={"ID":"465dec46-379b-4071-bea8-06eacd0286b5","Type":"ContainerDied","Data":"a1798676396e2c303e0198a2c1651b692f3886dbe03c24d307399bac0f052042"} Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.736829 5047 scope.go:117] "RemoveContainer" 
containerID="b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.737645 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mv2wk" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.756023 5047 scope.go:117] "RemoveContainer" containerID="a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.764465 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.764550 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshv4\" (UniqueName: \"kubernetes.io/projected/465dec46-379b-4071-bea8-06eacd0286b5-kube-api-access-wshv4\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.764580 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465dec46-379b-4071-bea8-06eacd0286b5-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.785595 5047 scope.go:117] "RemoveContainer" containerID="b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.789549 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mv2wk"] Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.795662 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mv2wk"] Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.810162 5047 scope.go:117] "RemoveContainer" containerID="b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386" Feb 23 07:53:42 crc 
kubenswrapper[5047]: E0223 07:53:42.810790 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386\": container with ID starting with b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386 not found: ID does not exist" containerID="b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.810861 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386"} err="failed to get container status \"b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386\": rpc error: code = NotFound desc = could not find container \"b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386\": container with ID starting with b783ff62d5fdf5f395d81128db897860aa7c94e833cf56f767005ebd502c4386 not found: ID does not exist" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.810957 5047 scope.go:117] "RemoveContainer" containerID="a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0" Feb 23 07:53:42 crc kubenswrapper[5047]: E0223 07:53:42.811501 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0\": container with ID starting with a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0 not found: ID does not exist" containerID="a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.811540 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0"} err="failed to get container status 
\"a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0\": rpc error: code = NotFound desc = could not find container \"a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0\": container with ID starting with a619790e3f692640ba63a350f833e71821cbb8649b4ca052843fd3d59298c1a0 not found: ID does not exist" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.811568 5047 scope.go:117] "RemoveContainer" containerID="b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88" Feb 23 07:53:42 crc kubenswrapper[5047]: E0223 07:53:42.811849 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88\": container with ID starting with b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88 not found: ID does not exist" containerID="b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88" Feb 23 07:53:42 crc kubenswrapper[5047]: I0223 07:53:42.811871 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88"} err="failed to get container status \"b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88\": rpc error: code = NotFound desc = could not find container \"b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88\": container with ID starting with b452a7bf4115d17b4397e620c9929009c57b03075d68337488421a2182dadd88 not found: ID does not exist" Feb 23 07:53:44 crc kubenswrapper[5047]: I0223 07:53:44.361787 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465dec46-379b-4071-bea8-06eacd0286b5" path="/var/lib/kubelet/pods/465dec46-379b-4071-bea8-06eacd0286b5/volumes" Feb 23 07:53:47 crc kubenswrapper[5047]: I0223 07:53:47.341062 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 
07:53:47 crc kubenswrapper[5047]: E0223 07:53:47.341473 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:53:58 crc kubenswrapper[5047]: I0223 07:53:58.348428 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:53:58 crc kubenswrapper[5047]: E0223 07:53:58.350067 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:54:12 crc kubenswrapper[5047]: I0223 07:54:12.342401 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:54:12 crc kubenswrapper[5047]: E0223 07:54:12.343974 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:54:23 crc kubenswrapper[5047]: I0223 07:54:23.341247 5047 scope.go:117] "RemoveContainer" 
containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:54:23 crc kubenswrapper[5047]: E0223 07:54:23.343301 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:54:38 crc kubenswrapper[5047]: I0223 07:54:38.345133 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:54:38 crc kubenswrapper[5047]: E0223 07:54:38.346289 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 07:54:51 crc kubenswrapper[5047]: I0223 07:54:51.341410 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 07:54:52 crc kubenswrapper[5047]: I0223 07:54:52.513246 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"86b06842011a67573ea74653031203467bbbc28f6f24b0e78e76839c0311fd0b"} Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.456266 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlxd"] Feb 23 07:55:37 crc kubenswrapper[5047]: E0223 07:55:37.457628 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="extract-utilities" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.457648 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="extract-utilities" Feb 23 07:55:37 crc kubenswrapper[5047]: E0223 07:55:37.457685 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="extract-content" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.457696 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="extract-content" Feb 23 07:55:37 crc kubenswrapper[5047]: E0223 07:55:37.465064 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="extract-content" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.465144 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="extract-content" Feb 23 07:55:37 crc kubenswrapper[5047]: E0223 07:55:37.465213 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="registry-server" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.465227 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="registry-server" Feb 23 07:55:37 crc kubenswrapper[5047]: E0223 07:55:37.465262 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="registry-server" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.465273 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="registry-server" Feb 23 07:55:37 crc kubenswrapper[5047]: E0223 07:55:37.465289 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="extract-utilities" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.465302 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="extract-utilities" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.465749 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="426b37ab-98cd-4088-91a7-81854c832db6" containerName="registry-server" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.465779 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="465dec46-379b-4071-bea8-06eacd0286b5" containerName="registry-server" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.469080 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlxd"] Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.469235 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.540373 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-utilities\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.540436 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4kzw\" (UniqueName: \"kubernetes.io/projected/d6bc2c91-146b-4bb7-9371-a29c04998803-kube-api-access-s4kzw\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.540477 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-catalog-content\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.642569 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-utilities\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.642686 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4kzw\" (UniqueName: \"kubernetes.io/projected/d6bc2c91-146b-4bb7-9371-a29c04998803-kube-api-access-s4kzw\") pod 
\"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.642723 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-catalog-content\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.643465 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-utilities\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.643626 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-catalog-content\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.669696 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4kzw\" (UniqueName: \"kubernetes.io/projected/d6bc2c91-146b-4bb7-9371-a29c04998803-kube-api-access-s4kzw\") pod \"redhat-marketplace-tjlxd\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:37 crc kubenswrapper[5047]: I0223 07:55:37.821789 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:38 crc kubenswrapper[5047]: I0223 07:55:38.270701 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlxd"] Feb 23 07:55:39 crc kubenswrapper[5047]: I0223 07:55:39.002884 5047 generic.go:334] "Generic (PLEG): container finished" podID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerID="17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21" exitCode=0 Feb 23 07:55:39 crc kubenswrapper[5047]: I0223 07:55:39.002960 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlxd" event={"ID":"d6bc2c91-146b-4bb7-9371-a29c04998803","Type":"ContainerDied","Data":"17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21"} Feb 23 07:55:39 crc kubenswrapper[5047]: I0223 07:55:39.006166 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlxd" event={"ID":"d6bc2c91-146b-4bb7-9371-a29c04998803","Type":"ContainerStarted","Data":"ee8f0dbafd8f62e4842896680677bc6456d97a72419cebd66ac868939d3300ca"} Feb 23 07:55:40 crc kubenswrapper[5047]: I0223 07:55:40.018394 5047 generic.go:334] "Generic (PLEG): container finished" podID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerID="bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2" exitCode=0 Feb 23 07:55:40 crc kubenswrapper[5047]: I0223 07:55:40.018474 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlxd" event={"ID":"d6bc2c91-146b-4bb7-9371-a29c04998803","Type":"ContainerDied","Data":"bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2"} Feb 23 07:55:41 crc kubenswrapper[5047]: I0223 07:55:41.031846 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlxd" 
event={"ID":"d6bc2c91-146b-4bb7-9371-a29c04998803","Type":"ContainerStarted","Data":"872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f"} Feb 23 07:55:41 crc kubenswrapper[5047]: I0223 07:55:41.064574 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tjlxd" podStartSLOduration=2.666449938 podStartE2EDuration="4.064556314s" podCreationTimestamp="2026-02-23 07:55:37 +0000 UTC" firstStartedPulling="2026-02-23 07:55:39.00467894 +0000 UTC m=+4261.256006114" lastFinishedPulling="2026-02-23 07:55:40.402785336 +0000 UTC m=+4262.654112490" observedRunningTime="2026-02-23 07:55:41.058006947 +0000 UTC m=+4263.309334081" watchObservedRunningTime="2026-02-23 07:55:41.064556314 +0000 UTC m=+4263.315883448" Feb 23 07:55:47 crc kubenswrapper[5047]: I0223 07:55:47.822988 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:47 crc kubenswrapper[5047]: I0223 07:55:47.823752 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:47 crc kubenswrapper[5047]: I0223 07:55:47.908490 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:48 crc kubenswrapper[5047]: I0223 07:55:48.174556 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:48 crc kubenswrapper[5047]: I0223 07:55:48.233711 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlxd"] Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.124758 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tjlxd" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="registry-server" 
containerID="cri-o://872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f" gracePeriod=2 Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.645804 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.749606 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-utilities\") pod \"d6bc2c91-146b-4bb7-9371-a29c04998803\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.749835 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-catalog-content\") pod \"d6bc2c91-146b-4bb7-9371-a29c04998803\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.749938 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4kzw\" (UniqueName: \"kubernetes.io/projected/d6bc2c91-146b-4bb7-9371-a29c04998803-kube-api-access-s4kzw\") pod \"d6bc2c91-146b-4bb7-9371-a29c04998803\" (UID: \"d6bc2c91-146b-4bb7-9371-a29c04998803\") " Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.751755 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-utilities" (OuterVolumeSpecName: "utilities") pod "d6bc2c91-146b-4bb7-9371-a29c04998803" (UID: "d6bc2c91-146b-4bb7-9371-a29c04998803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.759058 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6bc2c91-146b-4bb7-9371-a29c04998803-kube-api-access-s4kzw" (OuterVolumeSpecName: "kube-api-access-s4kzw") pod "d6bc2c91-146b-4bb7-9371-a29c04998803" (UID: "d6bc2c91-146b-4bb7-9371-a29c04998803"). InnerVolumeSpecName "kube-api-access-s4kzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.774638 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6bc2c91-146b-4bb7-9371-a29c04998803" (UID: "d6bc2c91-146b-4bb7-9371-a29c04998803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.853178 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.853283 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6bc2c91-146b-4bb7-9371-a29c04998803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:55:50 crc kubenswrapper[5047]: I0223 07:55:50.853313 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4kzw\" (UniqueName: \"kubernetes.io/projected/d6bc2c91-146b-4bb7-9371-a29c04998803-kube-api-access-s4kzw\") on node \"crc\" DevicePath \"\"" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.140187 5047 generic.go:334] "Generic (PLEG): container finished" podID="d6bc2c91-146b-4bb7-9371-a29c04998803" 
containerID="872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f" exitCode=0 Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.140260 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlxd" event={"ID":"d6bc2c91-146b-4bb7-9371-a29c04998803","Type":"ContainerDied","Data":"872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f"} Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.140319 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjlxd" event={"ID":"d6bc2c91-146b-4bb7-9371-a29c04998803","Type":"ContainerDied","Data":"ee8f0dbafd8f62e4842896680677bc6456d97a72419cebd66ac868939d3300ca"} Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.140396 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjlxd" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.140412 5047 scope.go:117] "RemoveContainer" containerID="872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.175247 5047 scope.go:117] "RemoveContainer" containerID="bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.210079 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlxd"] Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.218776 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjlxd"] Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.232078 5047 scope.go:117] "RemoveContainer" containerID="17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.256307 5047 scope.go:117] "RemoveContainer" containerID="872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f" Feb 23 
07:55:51 crc kubenswrapper[5047]: E0223 07:55:51.256795 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f\": container with ID starting with 872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f not found: ID does not exist" containerID="872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.256840 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f"} err="failed to get container status \"872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f\": rpc error: code = NotFound desc = could not find container \"872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f\": container with ID starting with 872fd390f35e45148035485fca26116e5a28993c7f20bf92c98b390c955fe65f not found: ID does not exist" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.256879 5047 scope.go:117] "RemoveContainer" containerID="bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2" Feb 23 07:55:51 crc kubenswrapper[5047]: E0223 07:55:51.257205 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2\": container with ID starting with bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2 not found: ID does not exist" containerID="bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.257237 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2"} err="failed to get container status 
\"bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2\": rpc error: code = NotFound desc = could not find container \"bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2\": container with ID starting with bf902152dd702d104cc3aa578ab8b4780c76e6698591ae5effe65db63b849cd2 not found: ID does not exist" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.257262 5047 scope.go:117] "RemoveContainer" containerID="17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21" Feb 23 07:55:51 crc kubenswrapper[5047]: E0223 07:55:51.257690 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21\": container with ID starting with 17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21 not found: ID does not exist" containerID="17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21" Feb 23 07:55:51 crc kubenswrapper[5047]: I0223 07:55:51.257724 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21"} err="failed to get container status \"17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21\": rpc error: code = NotFound desc = could not find container \"17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21\": container with ID starting with 17b4b3d3c091b502ea23a20503f515328363b97d3570e955a7715efc2a46db21 not found: ID does not exist" Feb 23 07:55:52 crc kubenswrapper[5047]: I0223 07:55:52.361309 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" path="/var/lib/kubelet/pods/d6bc2c91-146b-4bb7-9371-a29c04998803/volumes" Feb 23 07:57:16 crc kubenswrapper[5047]: I0223 07:57:16.759814 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:57:16 crc kubenswrapper[5047]: I0223 07:57:16.760591 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:57:46 crc kubenswrapper[5047]: I0223 07:57:46.760465 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:57:46 crc kubenswrapper[5047]: I0223 07:57:46.761750 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:58:16 crc kubenswrapper[5047]: I0223 07:58:16.760321 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:58:16 crc kubenswrapper[5047]: I0223 07:58:16.762230 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:58:16 crc kubenswrapper[5047]: I0223 07:58:16.762350 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 07:58:16 crc kubenswrapper[5047]: I0223 07:58:16.763717 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86b06842011a67573ea74653031203467bbbc28f6f24b0e78e76839c0311fd0b"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:58:16 crc kubenswrapper[5047]: I0223 07:58:16.763839 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://86b06842011a67573ea74653031203467bbbc28f6f24b0e78e76839c0311fd0b" gracePeriod=600 Feb 23 07:58:17 crc kubenswrapper[5047]: I0223 07:58:17.569057 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="86b06842011a67573ea74653031203467bbbc28f6f24b0e78e76839c0311fd0b" exitCode=0 Feb 23 07:58:17 crc kubenswrapper[5047]: I0223 07:58:17.569167 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"86b06842011a67573ea74653031203467bbbc28f6f24b0e78e76839c0311fd0b"} Feb 23 07:58:17 crc kubenswrapper[5047]: I0223 07:58:17.569865 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062"} Feb 23 07:58:17 crc kubenswrapper[5047]: I0223 07:58:17.569900 5047 scope.go:117] "RemoveContainer" containerID="0b6d70dda441503db6d5eab9f1c1e6504002340040ce9e9452913f013ecfb863" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.223046 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s"] Feb 23 08:00:00 crc kubenswrapper[5047]: E0223 08:00:00.223826 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="extract-content" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.223838 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="extract-content" Feb 23 08:00:00 crc kubenswrapper[5047]: E0223 08:00:00.223854 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="extract-utilities" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.223860 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="extract-utilities" Feb 23 08:00:00 crc kubenswrapper[5047]: E0223 08:00:00.223871 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="registry-server" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.223877 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="registry-server" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.224074 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bc2c91-146b-4bb7-9371-a29c04998803" containerName="registry-server" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.224520 5047 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.226887 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.227109 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.248740 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s"] Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.332748 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2ppv\" (UniqueName: \"kubernetes.io/projected/64c334ee-69a0-4a5d-9c5c-7b253af16a18-kube-api-access-k2ppv\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.332839 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64c334ee-69a0-4a5d-9c5c-7b253af16a18-config-volume\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.332888 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64c334ee-69a0-4a5d-9c5c-7b253af16a18-secret-volume\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.434056 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64c334ee-69a0-4a5d-9c5c-7b253af16a18-config-volume\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.434210 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64c334ee-69a0-4a5d-9c5c-7b253af16a18-secret-volume\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.434370 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2ppv\" (UniqueName: \"kubernetes.io/projected/64c334ee-69a0-4a5d-9c5c-7b253af16a18-kube-api-access-k2ppv\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.435403 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64c334ee-69a0-4a5d-9c5c-7b253af16a18-config-volume\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.441321 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/64c334ee-69a0-4a5d-9c5c-7b253af16a18-secret-volume\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.453806 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2ppv\" (UniqueName: \"kubernetes.io/projected/64c334ee-69a0-4a5d-9c5c-7b253af16a18-kube-api-access-k2ppv\") pod \"collect-profiles-29530560-vqz2s\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:00 crc kubenswrapper[5047]: I0223 08:00:00.541929 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:01 crc kubenswrapper[5047]: I0223 08:00:01.020478 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s"] Feb 23 08:00:01 crc kubenswrapper[5047]: I0223 08:00:01.568066 5047 generic.go:334] "Generic (PLEG): container finished" podID="64c334ee-69a0-4a5d-9c5c-7b253af16a18" containerID="bf3b0236277e00464a5629357de2ec8dbe421e7731e772f51908e1af8425b9cc" exitCode=0 Feb 23 08:00:01 crc kubenswrapper[5047]: I0223 08:00:01.568154 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" event={"ID":"64c334ee-69a0-4a5d-9c5c-7b253af16a18","Type":"ContainerDied","Data":"bf3b0236277e00464a5629357de2ec8dbe421e7731e772f51908e1af8425b9cc"} Feb 23 08:00:01 crc kubenswrapper[5047]: I0223 08:00:01.568631 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" 
event={"ID":"64c334ee-69a0-4a5d-9c5c-7b253af16a18","Type":"ContainerStarted","Data":"e9e1e6177096d005df427b7efcbc60794607625ab919ad47f928f4c2dc323100"} Feb 23 08:00:02 crc kubenswrapper[5047]: I0223 08:00:02.876279 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:02 crc kubenswrapper[5047]: I0223 08:00:02.973332 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2ppv\" (UniqueName: \"kubernetes.io/projected/64c334ee-69a0-4a5d-9c5c-7b253af16a18-kube-api-access-k2ppv\") pod \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " Feb 23 08:00:02 crc kubenswrapper[5047]: I0223 08:00:02.973521 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64c334ee-69a0-4a5d-9c5c-7b253af16a18-secret-volume\") pod \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " Feb 23 08:00:02 crc kubenswrapper[5047]: I0223 08:00:02.973967 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64c334ee-69a0-4a5d-9c5c-7b253af16a18-config-volume\") pod \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\" (UID: \"64c334ee-69a0-4a5d-9c5c-7b253af16a18\") " Feb 23 08:00:02 crc kubenswrapper[5047]: I0223 08:00:02.974700 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c334ee-69a0-4a5d-9c5c-7b253af16a18-config-volume" (OuterVolumeSpecName: "config-volume") pod "64c334ee-69a0-4a5d-9c5c-7b253af16a18" (UID: "64c334ee-69a0-4a5d-9c5c-7b253af16a18"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:00:02 crc kubenswrapper[5047]: I0223 08:00:02.981834 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c334ee-69a0-4a5d-9c5c-7b253af16a18-kube-api-access-k2ppv" (OuterVolumeSpecName: "kube-api-access-k2ppv") pod "64c334ee-69a0-4a5d-9c5c-7b253af16a18" (UID: "64c334ee-69a0-4a5d-9c5c-7b253af16a18"). InnerVolumeSpecName "kube-api-access-k2ppv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:00:02 crc kubenswrapper[5047]: I0223 08:00:02.983219 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c334ee-69a0-4a5d-9c5c-7b253af16a18-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "64c334ee-69a0-4a5d-9c5c-7b253af16a18" (UID: "64c334ee-69a0-4a5d-9c5c-7b253af16a18"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.075870 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64c334ee-69a0-4a5d-9c5c-7b253af16a18-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.075940 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64c334ee-69a0-4a5d-9c5c-7b253af16a18-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.075961 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2ppv\" (UniqueName: \"kubernetes.io/projected/64c334ee-69a0-4a5d-9c5c-7b253af16a18-kube-api-access-k2ppv\") on node \"crc\" DevicePath \"\"" Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.590045 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" 
event={"ID":"64c334ee-69a0-4a5d-9c5c-7b253af16a18","Type":"ContainerDied","Data":"e9e1e6177096d005df427b7efcbc60794607625ab919ad47f928f4c2dc323100"} Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.590113 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e1e6177096d005df427b7efcbc60794607625ab919ad47f928f4c2dc323100" Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.590141 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s" Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.947846 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"] Feb 23 08:00:03 crc kubenswrapper[5047]: I0223 08:00:03.952899 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-t6vcz"] Feb 23 08:00:04 crc kubenswrapper[5047]: I0223 08:00:04.353711 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698c5219-ee1d-4076-a7fd-db26594dc4a8" path="/var/lib/kubelet/pods/698c5219-ee1d-4076-a7fd-db26594dc4a8/volumes" Feb 23 08:00:46 crc kubenswrapper[5047]: I0223 08:00:46.759608 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:00:46 crc kubenswrapper[5047]: I0223 08:00:46.760575 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:00:54 crc 
kubenswrapper[5047]: I0223 08:00:54.432131 5047 scope.go:117] "RemoveContainer" containerID="a60c336d0e0b6f5656493549d938a4e8047d61f249e7bffc4c7a30c54466afe4" Feb 23 08:01:16 crc kubenswrapper[5047]: I0223 08:01:16.759484 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:01:16 crc kubenswrapper[5047]: I0223 08:01:16.760261 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:01:46 crc kubenswrapper[5047]: I0223 08:01:46.759886 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:01:46 crc kubenswrapper[5047]: I0223 08:01:46.760620 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:01:46 crc kubenswrapper[5047]: I0223 08:01:46.760694 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 08:01:46 crc kubenswrapper[5047]: I0223 08:01:46.762156 5047 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:01:46 crc kubenswrapper[5047]: I0223 08:01:46.762495 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" gracePeriod=600 Feb 23 08:01:46 crc kubenswrapper[5047]: E0223 08:01:46.902989 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:01:47 crc kubenswrapper[5047]: I0223 08:01:47.588726 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" exitCode=0 Feb 23 08:01:47 crc kubenswrapper[5047]: I0223 08:01:47.588815 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062"} Feb 23 08:01:47 crc kubenswrapper[5047]: I0223 08:01:47.588936 5047 scope.go:117] "RemoveContainer" containerID="86b06842011a67573ea74653031203467bbbc28f6f24b0e78e76839c0311fd0b" Feb 23 08:01:47 crc 
kubenswrapper[5047]: I0223 08:01:47.589967 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:01:47 crc kubenswrapper[5047]: E0223 08:01:47.590419 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:01:58 crc kubenswrapper[5047]: I0223 08:01:58.344642 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:01:58 crc kubenswrapper[5047]: E0223 08:01:58.345838 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:02:13 crc kubenswrapper[5047]: I0223 08:02:13.341166 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:02:13 crc kubenswrapper[5047]: E0223 08:02:13.342646 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 
23 08:02:25 crc kubenswrapper[5047]: I0223 08:02:25.341342 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:02:25 crc kubenswrapper[5047]: E0223 08:02:25.343067 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:02:40 crc kubenswrapper[5047]: I0223 08:02:40.341662 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:02:40 crc kubenswrapper[5047]: E0223 08:02:40.342676 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:02:55 crc kubenswrapper[5047]: I0223 08:02:55.340970 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:02:55 crc kubenswrapper[5047]: E0223 08:02:55.342147 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:03:07 crc kubenswrapper[5047]: I0223 08:03:07.341598 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:03:07 crc kubenswrapper[5047]: E0223 08:03:07.343006 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:03:20 crc kubenswrapper[5047]: I0223 08:03:20.341882 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:03:20 crc kubenswrapper[5047]: E0223 08:03:20.343433 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:03:31 crc kubenswrapper[5047]: I0223 08:03:31.371648 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:03:31 crc kubenswrapper[5047]: E0223 08:03:31.373710 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:03:46 crc kubenswrapper[5047]: I0223 08:03:46.341584 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:03:46 crc kubenswrapper[5047]: E0223 08:03:46.342738 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.225489 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dpjmx"] Feb 23 08:03:47 crc kubenswrapper[5047]: E0223 08:03:47.226275 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c334ee-69a0-4a5d-9c5c-7b253af16a18" containerName="collect-profiles" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.226293 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c334ee-69a0-4a5d-9c5c-7b253af16a18" containerName="collect-profiles" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.286541 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c334ee-69a0-4a5d-9c5c-7b253af16a18" containerName="collect-profiles" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.290717 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dpjmx"] Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.290879 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.492151 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-catalog-content\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.492244 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-utilities\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.492301 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52kwl\" (UniqueName: \"kubernetes.io/projected/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-kube-api-access-52kwl\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.593847 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-catalog-content\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.594270 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-utilities\") pod 
\"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.594420 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52kwl\" (UniqueName: \"kubernetes.io/projected/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-kube-api-access-52kwl\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.594607 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-catalog-content\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.594995 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-utilities\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.637583 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52kwl\" (UniqueName: \"kubernetes.io/projected/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-kube-api-access-52kwl\") pod \"community-operators-dpjmx\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:47 crc kubenswrapper[5047]: I0223 08:03:47.928152 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:48 crc kubenswrapper[5047]: I0223 08:03:48.431877 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dpjmx"] Feb 23 08:03:48 crc kubenswrapper[5047]: I0223 08:03:48.814296 5047 generic.go:334] "Generic (PLEG): container finished" podID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerID="666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253" exitCode=0 Feb 23 08:03:48 crc kubenswrapper[5047]: I0223 08:03:48.814386 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpjmx" event={"ID":"6e2572fd-c94e-4f8f-99e5-daa26e7031e6","Type":"ContainerDied","Data":"666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253"} Feb 23 08:03:48 crc kubenswrapper[5047]: I0223 08:03:48.814427 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpjmx" event={"ID":"6e2572fd-c94e-4f8f-99e5-daa26e7031e6","Type":"ContainerStarted","Data":"883690ca81872cb958bd1398c9db3eb8187e405568b3986254efff5d2f8a6101"} Feb 23 08:03:48 crc kubenswrapper[5047]: I0223 08:03:48.816641 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:03:50 crc kubenswrapper[5047]: I0223 08:03:50.840216 5047 generic.go:334] "Generic (PLEG): container finished" podID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerID="a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05" exitCode=0 Feb 23 08:03:50 crc kubenswrapper[5047]: I0223 08:03:50.840361 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpjmx" event={"ID":"6e2572fd-c94e-4f8f-99e5-daa26e7031e6","Type":"ContainerDied","Data":"a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05"} Feb 23 08:03:51 crc kubenswrapper[5047]: I0223 08:03:51.850972 5047 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-dpjmx" event={"ID":"6e2572fd-c94e-4f8f-99e5-daa26e7031e6","Type":"ContainerStarted","Data":"4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe"} Feb 23 08:03:51 crc kubenswrapper[5047]: I0223 08:03:51.872415 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dpjmx" podStartSLOduration=2.087735286 podStartE2EDuration="4.87239155s" podCreationTimestamp="2026-02-23 08:03:47 +0000 UTC" firstStartedPulling="2026-02-23 08:03:48.816343464 +0000 UTC m=+4751.067670598" lastFinishedPulling="2026-02-23 08:03:51.600999738 +0000 UTC m=+4753.852326862" observedRunningTime="2026-02-23 08:03:51.869975425 +0000 UTC m=+4754.121302579" watchObservedRunningTime="2026-02-23 08:03:51.87239155 +0000 UTC m=+4754.123718704" Feb 23 08:03:57 crc kubenswrapper[5047]: I0223 08:03:57.340972 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:03:57 crc kubenswrapper[5047]: E0223 08:03:57.341778 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:03:57 crc kubenswrapper[5047]: I0223 08:03:57.928514 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:57 crc kubenswrapper[5047]: I0223 08:03:57.929063 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:57 crc kubenswrapper[5047]: I0223 08:03:57.984227 5047 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:58 crc kubenswrapper[5047]: I0223 08:03:58.994799 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:03:59 crc kubenswrapper[5047]: I0223 08:03:59.069668 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dpjmx"] Feb 23 08:04:00 crc kubenswrapper[5047]: I0223 08:04:00.939055 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dpjmx" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="registry-server" containerID="cri-o://4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe" gracePeriod=2 Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.480546 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.662575 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-catalog-content\") pod \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.662789 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-utilities\") pod \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.662889 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52kwl\" (UniqueName: 
\"kubernetes.io/projected/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-kube-api-access-52kwl\") pod \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\" (UID: \"6e2572fd-c94e-4f8f-99e5-daa26e7031e6\") " Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.664560 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-utilities" (OuterVolumeSpecName: "utilities") pod "6e2572fd-c94e-4f8f-99e5-daa26e7031e6" (UID: "6e2572fd-c94e-4f8f-99e5-daa26e7031e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.670924 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-kube-api-access-52kwl" (OuterVolumeSpecName: "kube-api-access-52kwl") pod "6e2572fd-c94e-4f8f-99e5-daa26e7031e6" (UID: "6e2572fd-c94e-4f8f-99e5-daa26e7031e6"). InnerVolumeSpecName "kube-api-access-52kwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.736174 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e2572fd-c94e-4f8f-99e5-daa26e7031e6" (UID: "6e2572fd-c94e-4f8f-99e5-daa26e7031e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.764807 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52kwl\" (UniqueName: \"kubernetes.io/projected/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-kube-api-access-52kwl\") on node \"crc\" DevicePath \"\"" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.764851 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.764862 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2572fd-c94e-4f8f-99e5-daa26e7031e6-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.951534 5047 generic.go:334] "Generic (PLEG): container finished" podID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerID="4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe" exitCode=0 Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.951613 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpjmx" event={"ID":"6e2572fd-c94e-4f8f-99e5-daa26e7031e6","Type":"ContainerDied","Data":"4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe"} Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.951694 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dpjmx" event={"ID":"6e2572fd-c94e-4f8f-99e5-daa26e7031e6","Type":"ContainerDied","Data":"883690ca81872cb958bd1398c9db3eb8187e405568b3986254efff5d2f8a6101"} Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.951730 5047 scope.go:117] "RemoveContainer" containerID="4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 
08:04:01.951742 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dpjmx" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.979642 5047 scope.go:117] "RemoveContainer" containerID="a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05" Feb 23 08:04:01 crc kubenswrapper[5047]: I0223 08:04:01.993510 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dpjmx"] Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.000312 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dpjmx"] Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.013096 5047 scope.go:117] "RemoveContainer" containerID="666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253" Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.033492 5047 scope.go:117] "RemoveContainer" containerID="4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe" Feb 23 08:04:02 crc kubenswrapper[5047]: E0223 08:04:02.034486 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe\": container with ID starting with 4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe not found: ID does not exist" containerID="4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe" Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.034537 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe"} err="failed to get container status \"4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe\": rpc error: code = NotFound desc = could not find container \"4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe\": container with ID starting with 
4c19528b0a0c6613cb2bdf1be89cfa95f134e45ec37e1a40f4833388bb14dbbe not found: ID does not exist" Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.034566 5047 scope.go:117] "RemoveContainer" containerID="a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05" Feb 23 08:04:02 crc kubenswrapper[5047]: E0223 08:04:02.035209 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05\": container with ID starting with a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05 not found: ID does not exist" containerID="a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05" Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.035290 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05"} err="failed to get container status \"a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05\": rpc error: code = NotFound desc = could not find container \"a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05\": container with ID starting with a2af2f6a83c58b68c170e7b18f36eefaab7aa2930c993bec9bef1f327c01ca05 not found: ID does not exist" Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.035345 5047 scope.go:117] "RemoveContainer" containerID="666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253" Feb 23 08:04:02 crc kubenswrapper[5047]: E0223 08:04:02.035808 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253\": container with ID starting with 666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253 not found: ID does not exist" containerID="666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253" Feb 23 08:04:02 crc 
kubenswrapper[5047]: I0223 08:04:02.035882 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253"} err="failed to get container status \"666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253\": rpc error: code = NotFound desc = could not find container \"666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253\": container with ID starting with 666547f13b5a441b69a1dd70c0c4e0b2bf25c0bffbfebc8f41d2f550ede7d253 not found: ID does not exist" Feb 23 08:04:02 crc kubenswrapper[5047]: I0223 08:04:02.353208 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" path="/var/lib/kubelet/pods/6e2572fd-c94e-4f8f-99e5-daa26e7031e6/volumes" Feb 23 08:04:12 crc kubenswrapper[5047]: I0223 08:04:12.342150 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:04:12 crc kubenswrapper[5047]: E0223 08:04:12.343510 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:04:27 crc kubenswrapper[5047]: I0223 08:04:27.340980 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:04:27 crc kubenswrapper[5047]: E0223 08:04:27.342181 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:04:41 crc kubenswrapper[5047]: I0223 08:04:41.341216 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:04:41 crc kubenswrapper[5047]: E0223 08:04:41.342744 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:04:56 crc kubenswrapper[5047]: I0223 08:04:56.341109 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:04:56 crc kubenswrapper[5047]: E0223 08:04:56.342286 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:05:09 crc kubenswrapper[5047]: I0223 08:05:09.341141 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:05:09 crc kubenswrapper[5047]: E0223 08:05:09.341998 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:05:22 crc kubenswrapper[5047]: I0223 08:05:22.341854 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:05:22 crc kubenswrapper[5047]: E0223 08:05:22.343267 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:05:33 crc kubenswrapper[5047]: I0223 08:05:33.341743 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:05:33 crc kubenswrapper[5047]: E0223 08:05:33.342516 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:05:44 crc kubenswrapper[5047]: I0223 08:05:44.341419 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:05:44 crc kubenswrapper[5047]: E0223 08:05:44.342346 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:05:55 crc kubenswrapper[5047]: I0223 08:05:55.341248 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:05:55 crc kubenswrapper[5047]: E0223 08:05:55.342347 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:06:09 crc kubenswrapper[5047]: I0223 08:06:09.341436 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:06:09 crc kubenswrapper[5047]: E0223 08:06:09.342465 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.477406 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-896kd"] Feb 23 08:06:11 crc kubenswrapper[5047]: E0223 08:06:11.478100 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="extract-content" Feb 23 
08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.478113 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="extract-content" Feb 23 08:06:11 crc kubenswrapper[5047]: E0223 08:06:11.478129 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="registry-server" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.478139 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="registry-server" Feb 23 08:06:11 crc kubenswrapper[5047]: E0223 08:06:11.478156 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="extract-utilities" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.478164 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="extract-utilities" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.478327 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2572fd-c94e-4f8f-99e5-daa26e7031e6" containerName="registry-server" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.479298 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.507954 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-896kd"] Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.619527 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-utilities\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.619721 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-catalog-content\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.619924 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frsdm\" (UniqueName: \"kubernetes.io/projected/837a24f9-2451-463c-a01b-aad36c4afad1-kube-api-access-frsdm\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.722013 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-utilities\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.722263 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-catalog-content\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.722864 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-utilities\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.723016 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-catalog-content\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.723149 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frsdm\" (UniqueName: \"kubernetes.io/projected/837a24f9-2451-463c-a01b-aad36c4afad1-kube-api-access-frsdm\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.760717 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frsdm\" (UniqueName: \"kubernetes.io/projected/837a24f9-2451-463c-a01b-aad36c4afad1-kube-api-access-frsdm\") pod \"redhat-marketplace-896kd\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:11 crc kubenswrapper[5047]: I0223 08:06:11.803955 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:12 crc kubenswrapper[5047]: I0223 08:06:12.278977 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-896kd"] Feb 23 08:06:13 crc kubenswrapper[5047]: I0223 08:06:13.250561 5047 generic.go:334] "Generic (PLEG): container finished" podID="837a24f9-2451-463c-a01b-aad36c4afad1" containerID="70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9" exitCode=0 Feb 23 08:06:13 crc kubenswrapper[5047]: I0223 08:06:13.250651 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896kd" event={"ID":"837a24f9-2451-463c-a01b-aad36c4afad1","Type":"ContainerDied","Data":"70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9"} Feb 23 08:06:13 crc kubenswrapper[5047]: I0223 08:06:13.251122 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896kd" event={"ID":"837a24f9-2451-463c-a01b-aad36c4afad1","Type":"ContainerStarted","Data":"21495c83d9088fb351b168a33e04d2f89ce115f749d666c6b41df49725474cb5"} Feb 23 08:06:14 crc kubenswrapper[5047]: I0223 08:06:14.265545 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896kd" event={"ID":"837a24f9-2451-463c-a01b-aad36c4afad1","Type":"ContainerStarted","Data":"2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f"} Feb 23 08:06:15 crc kubenswrapper[5047]: I0223 08:06:15.280455 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896kd" event={"ID":"837a24f9-2451-463c-a01b-aad36c4afad1","Type":"ContainerDied","Data":"2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f"} Feb 23 08:06:15 crc kubenswrapper[5047]: I0223 08:06:15.280357 5047 generic.go:334] "Generic (PLEG): container finished" podID="837a24f9-2451-463c-a01b-aad36c4afad1" 
containerID="2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f" exitCode=0 Feb 23 08:06:16 crc kubenswrapper[5047]: I0223 08:06:16.297017 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896kd" event={"ID":"837a24f9-2451-463c-a01b-aad36c4afad1","Type":"ContainerStarted","Data":"e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3"} Feb 23 08:06:16 crc kubenswrapper[5047]: I0223 08:06:16.336673 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-896kd" podStartSLOduration=2.865779682 podStartE2EDuration="5.336643352s" podCreationTimestamp="2026-02-23 08:06:11 +0000 UTC" firstStartedPulling="2026-02-23 08:06:13.25294393 +0000 UTC m=+4895.504271074" lastFinishedPulling="2026-02-23 08:06:15.72380757 +0000 UTC m=+4897.975134744" observedRunningTime="2026-02-23 08:06:16.327010472 +0000 UTC m=+4898.578337676" watchObservedRunningTime="2026-02-23 08:06:16.336643352 +0000 UTC m=+4898.587970526" Feb 23 08:06:21 crc kubenswrapper[5047]: I0223 08:06:21.805380 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:21 crc kubenswrapper[5047]: I0223 08:06:21.807513 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:21 crc kubenswrapper[5047]: I0223 08:06:21.886402 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:22 crc kubenswrapper[5047]: I0223 08:06:22.341251 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:06:22 crc kubenswrapper[5047]: E0223 08:06:22.341757 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:06:22 crc kubenswrapper[5047]: I0223 08:06:22.431194 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:22 crc kubenswrapper[5047]: I0223 08:06:22.491729 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-896kd"] Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.364659 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-896kd" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" containerName="registry-server" containerID="cri-o://e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3" gracePeriod=2 Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.833048 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.964843 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frsdm\" (UniqueName: \"kubernetes.io/projected/837a24f9-2451-463c-a01b-aad36c4afad1-kube-api-access-frsdm\") pod \"837a24f9-2451-463c-a01b-aad36c4afad1\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.964980 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-utilities\") pod \"837a24f9-2451-463c-a01b-aad36c4afad1\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.966171 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-catalog-content\") pod \"837a24f9-2451-463c-a01b-aad36c4afad1\" (UID: \"837a24f9-2451-463c-a01b-aad36c4afad1\") " Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.966603 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-utilities" (OuterVolumeSpecName: "utilities") pod "837a24f9-2451-463c-a01b-aad36c4afad1" (UID: "837a24f9-2451-463c-a01b-aad36c4afad1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.966747 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:24 crc kubenswrapper[5047]: I0223 08:06:24.977271 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837a24f9-2451-463c-a01b-aad36c4afad1-kube-api-access-frsdm" (OuterVolumeSpecName: "kube-api-access-frsdm") pod "837a24f9-2451-463c-a01b-aad36c4afad1" (UID: "837a24f9-2451-463c-a01b-aad36c4afad1"). InnerVolumeSpecName "kube-api-access-frsdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.018016 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "837a24f9-2451-463c-a01b-aad36c4afad1" (UID: "837a24f9-2451-463c-a01b-aad36c4afad1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.068565 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837a24f9-2451-463c-a01b-aad36c4afad1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.068680 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frsdm\" (UniqueName: \"kubernetes.io/projected/837a24f9-2451-463c-a01b-aad36c4afad1-kube-api-access-frsdm\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.380472 5047 generic.go:334] "Generic (PLEG): container finished" podID="837a24f9-2451-463c-a01b-aad36c4afad1" containerID="e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3" exitCode=0 Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.380586 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896kd" event={"ID":"837a24f9-2451-463c-a01b-aad36c4afad1","Type":"ContainerDied","Data":"e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3"} Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.380695 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896kd" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.381714 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896kd" event={"ID":"837a24f9-2451-463c-a01b-aad36c4afad1","Type":"ContainerDied","Data":"21495c83d9088fb351b168a33e04d2f89ce115f749d666c6b41df49725474cb5"} Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.381736 5047 scope.go:117] "RemoveContainer" containerID="e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.455944 5047 scope.go:117] "RemoveContainer" containerID="2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.464158 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-896kd"] Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.473839 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-896kd"] Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.479299 5047 scope.go:117] "RemoveContainer" containerID="70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.522048 5047 scope.go:117] "RemoveContainer" containerID="e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3" Feb 23 08:06:25 crc kubenswrapper[5047]: E0223 08:06:25.523133 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3\": container with ID starting with e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3 not found: ID does not exist" containerID="e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.523193 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3"} err="failed to get container status \"e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3\": rpc error: code = NotFound desc = could not find container \"e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3\": container with ID starting with e7aaaad8375c31660c3762e9a6a0b7b8ca41de8ed33c78efa2d1a1f2d7b298c3 not found: ID does not exist" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.523230 5047 scope.go:117] "RemoveContainer" containerID="2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f" Feb 23 08:06:25 crc kubenswrapper[5047]: E0223 08:06:25.524632 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f\": container with ID starting with 2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f not found: ID does not exist" containerID="2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.524684 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f"} err="failed to get container status \"2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f\": rpc error: code = NotFound desc = could not find container \"2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f\": container with ID starting with 2f6818cefa94078be962e81deb17c81c74692b9d0e855d9d01924ead3facd53f not found: ID does not exist" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.524712 5047 scope.go:117] "RemoveContainer" containerID="70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9" Feb 23 08:06:25 crc kubenswrapper[5047]: E0223 
08:06:25.525039 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9\": container with ID starting with 70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9 not found: ID does not exist" containerID="70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9" Feb 23 08:06:25 crc kubenswrapper[5047]: I0223 08:06:25.525066 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9"} err="failed to get container status \"70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9\": rpc error: code = NotFound desc = could not find container \"70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9\": container with ID starting with 70566ce90fd98b0b7a5aa2406442c1fdcffb1d0b79c4138e1a0d86fcaefe79c9 not found: ID does not exist" Feb 23 08:06:26 crc kubenswrapper[5047]: I0223 08:06:26.363974 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" path="/var/lib/kubelet/pods/837a24f9-2451-463c-a01b-aad36c4afad1/volumes" Feb 23 08:06:34 crc kubenswrapper[5047]: I0223 08:06:34.341458 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:06:34 crc kubenswrapper[5047]: E0223 08:06:34.342637 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:06:47 crc kubenswrapper[5047]: I0223 08:06:47.341382 
5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062" Feb 23 08:06:47 crc kubenswrapper[5047]: I0223 08:06:47.609104 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"332cc8df4762a91bdb66c6860a540e49fed45f15e7251e3a2ad529fe539ef41a"} Feb 23 08:08:28 crc kubenswrapper[5047]: I0223 08:08:28.958675 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nmdx8"] Feb 23 08:08:28 crc kubenswrapper[5047]: E0223 08:08:28.960754 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" containerName="registry-server" Feb 23 08:08:28 crc kubenswrapper[5047]: I0223 08:08:28.960779 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" containerName="registry-server" Feb 23 08:08:28 crc kubenswrapper[5047]: E0223 08:08:28.960834 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" containerName="extract-utilities" Feb 23 08:08:28 crc kubenswrapper[5047]: I0223 08:08:28.960846 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" containerName="extract-utilities" Feb 23 08:08:28 crc kubenswrapper[5047]: E0223 08:08:28.960880 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" containerName="extract-content" Feb 23 08:08:28 crc kubenswrapper[5047]: I0223 08:08:28.960893 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" containerName="extract-content" Feb 23 08:08:28 crc kubenswrapper[5047]: I0223 08:08:28.961171 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="837a24f9-2451-463c-a01b-aad36c4afad1" 
containerName="registry-server" Feb 23 08:08:28 crc kubenswrapper[5047]: I0223 08:08:28.962893 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:28 crc kubenswrapper[5047]: I0223 08:08:28.982836 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmdx8"] Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.046335 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-catalog-content\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.046403 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-utilities\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.046575 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfrg\" (UniqueName: \"kubernetes.io/projected/6fe13277-9652-4d33-8f78-83ec11da0556-kube-api-access-ngfrg\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.152032 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-catalog-content\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " 
pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.152509 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-utilities\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.152582 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfrg\" (UniqueName: \"kubernetes.io/projected/6fe13277-9652-4d33-8f78-83ec11da0556-kube-api-access-ngfrg\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.152689 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-catalog-content\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.153144 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-utilities\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.187433 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfrg\" (UniqueName: \"kubernetes.io/projected/6fe13277-9652-4d33-8f78-83ec11da0556-kube-api-access-ngfrg\") pod \"redhat-operators-nmdx8\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") " pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 
crc kubenswrapper[5047]: I0223 08:08:29.321184 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:29 crc kubenswrapper[5047]: I0223 08:08:29.798034 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmdx8"] Feb 23 08:08:30 crc kubenswrapper[5047]: I0223 08:08:30.704700 5047 generic.go:334] "Generic (PLEG): container finished" podID="6fe13277-9652-4d33-8f78-83ec11da0556" containerID="a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67" exitCode=0 Feb 23 08:08:30 crc kubenswrapper[5047]: I0223 08:08:30.704796 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmdx8" event={"ID":"6fe13277-9652-4d33-8f78-83ec11da0556","Type":"ContainerDied","Data":"a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67"} Feb 23 08:08:30 crc kubenswrapper[5047]: I0223 08:08:30.705070 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmdx8" event={"ID":"6fe13277-9652-4d33-8f78-83ec11da0556","Type":"ContainerStarted","Data":"1ef9c21f3823f1d16caecf07fd1b755add70eeffe3f820da45df35e02a5c9da3"} Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.350360 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zd94"] Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.352300 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.362980 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zd94"] Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.498234 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-utilities\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.498794 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvrq\" (UniqueName: \"kubernetes.io/projected/2455b784-ab99-4cda-8194-5e4fc8fc6550-kube-api-access-8zvrq\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.498832 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-catalog-content\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.600731 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-utilities\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.600797 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8zvrq\" (UniqueName: \"kubernetes.io/projected/2455b784-ab99-4cda-8194-5e4fc8fc6550-kube-api-access-8zvrq\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.600818 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-catalog-content\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.601371 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-catalog-content\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.601855 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-utilities\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.621779 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvrq\" (UniqueName: \"kubernetes.io/projected/2455b784-ab99-4cda-8194-5e4fc8fc6550-kube-api-access-8zvrq\") pod \"certified-operators-7zd94\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.717060 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nmdx8" event={"ID":"6fe13277-9652-4d33-8f78-83ec11da0556","Type":"ContainerStarted","Data":"c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0"} Feb 23 08:08:31 crc kubenswrapper[5047]: I0223 08:08:31.753489 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:32 crc kubenswrapper[5047]: I0223 08:08:32.435688 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zd94"] Feb 23 08:08:32 crc kubenswrapper[5047]: W0223 08:08:32.443506 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2455b784_ab99_4cda_8194_5e4fc8fc6550.slice/crio-7f94212d8ecf5841fa00efe9461b1e469e0ff7c35670e6173d73563dd000ad2f WatchSource:0}: Error finding container 7f94212d8ecf5841fa00efe9461b1e469e0ff7c35670e6173d73563dd000ad2f: Status 404 returned error can't find the container with id 7f94212d8ecf5841fa00efe9461b1e469e0ff7c35670e6173d73563dd000ad2f Feb 23 08:08:32 crc kubenswrapper[5047]: I0223 08:08:32.733500 5047 generic.go:334] "Generic (PLEG): container finished" podID="6fe13277-9652-4d33-8f78-83ec11da0556" containerID="c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0" exitCode=0 Feb 23 08:08:32 crc kubenswrapper[5047]: I0223 08:08:32.733589 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmdx8" event={"ID":"6fe13277-9652-4d33-8f78-83ec11da0556","Type":"ContainerDied","Data":"c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0"} Feb 23 08:08:32 crc kubenswrapper[5047]: I0223 08:08:32.735346 5047 generic.go:334] "Generic (PLEG): container finished" podID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerID="f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0" exitCode=0 Feb 23 08:08:32 crc kubenswrapper[5047]: I0223 
08:08:32.735370 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zd94" event={"ID":"2455b784-ab99-4cda-8194-5e4fc8fc6550","Type":"ContainerDied","Data":"f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0"} Feb 23 08:08:32 crc kubenswrapper[5047]: I0223 08:08:32.735384 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zd94" event={"ID":"2455b784-ab99-4cda-8194-5e4fc8fc6550","Type":"ContainerStarted","Data":"7f94212d8ecf5841fa00efe9461b1e469e0ff7c35670e6173d73563dd000ad2f"} Feb 23 08:08:33 crc kubenswrapper[5047]: I0223 08:08:33.748397 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmdx8" event={"ID":"6fe13277-9652-4d33-8f78-83ec11da0556","Type":"ContainerStarted","Data":"37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d"} Feb 23 08:08:33 crc kubenswrapper[5047]: I0223 08:08:33.751324 5047 generic.go:334] "Generic (PLEG): container finished" podID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerID="f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521" exitCode=0 Feb 23 08:08:33 crc kubenswrapper[5047]: I0223 08:08:33.751358 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zd94" event={"ID":"2455b784-ab99-4cda-8194-5e4fc8fc6550","Type":"ContainerDied","Data":"f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521"} Feb 23 08:08:33 crc kubenswrapper[5047]: I0223 08:08:33.786095 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nmdx8" podStartSLOduration=3.342843763 podStartE2EDuration="5.786070829s" podCreationTimestamp="2026-02-23 08:08:28 +0000 UTC" firstStartedPulling="2026-02-23 08:08:30.706639943 +0000 UTC m=+5032.957967107" lastFinishedPulling="2026-02-23 08:08:33.149867029 +0000 UTC m=+5035.401194173" observedRunningTime="2026-02-23 
08:08:33.780679584 +0000 UTC m=+5036.032006718" watchObservedRunningTime="2026-02-23 08:08:33.786070829 +0000 UTC m=+5036.037397963" Feb 23 08:08:34 crc kubenswrapper[5047]: I0223 08:08:34.760826 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zd94" event={"ID":"2455b784-ab99-4cda-8194-5e4fc8fc6550","Type":"ContainerStarted","Data":"a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50"} Feb 23 08:08:34 crc kubenswrapper[5047]: I0223 08:08:34.781933 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zd94" podStartSLOduration=2.388225728 podStartE2EDuration="3.781895503s" podCreationTimestamp="2026-02-23 08:08:31 +0000 UTC" firstStartedPulling="2026-02-23 08:08:32.737026053 +0000 UTC m=+5034.988353187" lastFinishedPulling="2026-02-23 08:08:34.130695818 +0000 UTC m=+5036.382022962" observedRunningTime="2026-02-23 08:08:34.781583425 +0000 UTC m=+5037.032910559" watchObservedRunningTime="2026-02-23 08:08:34.781895503 +0000 UTC m=+5037.033222637" Feb 23 08:08:39 crc kubenswrapper[5047]: I0223 08:08:39.322138 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:39 crc kubenswrapper[5047]: I0223 08:08:39.322213 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nmdx8" Feb 23 08:08:40 crc kubenswrapper[5047]: I0223 08:08:40.375728 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nmdx8" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="registry-server" probeResult="failure" output=< Feb 23 08:08:40 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:08:40 crc kubenswrapper[5047]: > Feb 23 08:08:41 crc kubenswrapper[5047]: I0223 08:08:41.754602 5047 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:41 crc kubenswrapper[5047]: I0223 08:08:41.754670 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:41 crc kubenswrapper[5047]: I0223 08:08:41.814878 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:41 crc kubenswrapper[5047]: I0223 08:08:41.874881 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:42 crc kubenswrapper[5047]: I0223 08:08:42.065834 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zd94"] Feb 23 08:08:43 crc kubenswrapper[5047]: I0223 08:08:43.831893 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zd94" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="registry-server" containerID="cri-o://a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50" gracePeriod=2 Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.348256 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.459601 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zvrq\" (UniqueName: \"kubernetes.io/projected/2455b784-ab99-4cda-8194-5e4fc8fc6550-kube-api-access-8zvrq\") pod \"2455b784-ab99-4cda-8194-5e4fc8fc6550\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.459808 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-utilities\") pod \"2455b784-ab99-4cda-8194-5e4fc8fc6550\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.459882 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-catalog-content\") pod \"2455b784-ab99-4cda-8194-5e4fc8fc6550\" (UID: \"2455b784-ab99-4cda-8194-5e4fc8fc6550\") " Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.461039 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-utilities" (OuterVolumeSpecName: "utilities") pod "2455b784-ab99-4cda-8194-5e4fc8fc6550" (UID: "2455b784-ab99-4cda-8194-5e4fc8fc6550"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.466364 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2455b784-ab99-4cda-8194-5e4fc8fc6550-kube-api-access-8zvrq" (OuterVolumeSpecName: "kube-api-access-8zvrq") pod "2455b784-ab99-4cda-8194-5e4fc8fc6550" (UID: "2455b784-ab99-4cda-8194-5e4fc8fc6550"). InnerVolumeSpecName "kube-api-access-8zvrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.516073 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2455b784-ab99-4cda-8194-5e4fc8fc6550" (UID: "2455b784-ab99-4cda-8194-5e4fc8fc6550"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.563531 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zvrq\" (UniqueName: \"kubernetes.io/projected/2455b784-ab99-4cda-8194-5e4fc8fc6550-kube-api-access-8zvrq\") on node \"crc\" DevicePath \"\"" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.563591 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.563610 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455b784-ab99-4cda-8194-5e4fc8fc6550-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.843448 5047 generic.go:334] "Generic (PLEG): container finished" podID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerID="a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50" exitCode=0 Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.843515 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zd94" event={"ID":"2455b784-ab99-4cda-8194-5e4fc8fc6550","Type":"ContainerDied","Data":"a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50"} Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.843548 5047 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zd94" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.843577 5047 scope.go:117] "RemoveContainer" containerID="a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.843561 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zd94" event={"ID":"2455b784-ab99-4cda-8194-5e4fc8fc6550","Type":"ContainerDied","Data":"7f94212d8ecf5841fa00efe9461b1e469e0ff7c35670e6173d73563dd000ad2f"} Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.870777 5047 scope.go:117] "RemoveContainer" containerID="f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.910994 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zd94"] Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.919165 5047 scope.go:117] "RemoveContainer" containerID="f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.922945 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zd94"] Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.951222 5047 scope.go:117] "RemoveContainer" containerID="a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50" Feb 23 08:08:44 crc kubenswrapper[5047]: E0223 08:08:44.951652 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50\": container with ID starting with a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50 not found: ID does not exist" containerID="a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.951700 
5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50"} err="failed to get container status \"a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50\": rpc error: code = NotFound desc = could not find container \"a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50\": container with ID starting with a826c3646ba19fd5e360db6db9e2071d1f9ad84d80fe1302aa7de959a594fd50 not found: ID does not exist" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.951730 5047 scope.go:117] "RemoveContainer" containerID="f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521" Feb 23 08:08:44 crc kubenswrapper[5047]: E0223 08:08:44.951993 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521\": container with ID starting with f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521 not found: ID does not exist" containerID="f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.952022 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521"} err="failed to get container status \"f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521\": rpc error: code = NotFound desc = could not find container \"f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521\": container with ID starting with f1a8d1fe1dbc72f6bcd5afad2391511d12cf126ba2cdd37bd21af25b26302521 not found: ID does not exist" Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.952040 5047 scope.go:117] "RemoveContainer" containerID="f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0" Feb 23 08:08:44 crc kubenswrapper[5047]: E0223 
08:08:44.952787 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0\": container with ID starting with f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0 not found: ID does not exist" containerID="f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0"
Feb 23 08:08:44 crc kubenswrapper[5047]: I0223 08:08:44.952814 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0"} err="failed to get container status \"f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0\": rpc error: code = NotFound desc = could not find container \"f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0\": container with ID starting with f225a86c1fef385784738547d054990b28b0b5acf01b506bdcd21ebf659495d0 not found: ID does not exist"
Feb 23 08:08:46 crc kubenswrapper[5047]: I0223 08:08:46.352296 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" path="/var/lib/kubelet/pods/2455b784-ab99-4cda-8194-5e4fc8fc6550/volumes"
Feb 23 08:08:49 crc kubenswrapper[5047]: I0223 08:08:49.395846 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nmdx8"
Feb 23 08:08:49 crc kubenswrapper[5047]: I0223 08:08:49.462677 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nmdx8"
Feb 23 08:08:49 crc kubenswrapper[5047]: I0223 08:08:49.665582 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmdx8"]
Feb 23 08:08:50 crc kubenswrapper[5047]: I0223 08:08:50.897471 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nmdx8" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="registry-server" containerID="cri-o://37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d" gracePeriod=2
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.385675 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmdx8"
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.484687 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngfrg\" (UniqueName: \"kubernetes.io/projected/6fe13277-9652-4d33-8f78-83ec11da0556-kube-api-access-ngfrg\") pod \"6fe13277-9652-4d33-8f78-83ec11da0556\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") "
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.484791 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-catalog-content\") pod \"6fe13277-9652-4d33-8f78-83ec11da0556\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") "
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.484860 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-utilities\") pod \"6fe13277-9652-4d33-8f78-83ec11da0556\" (UID: \"6fe13277-9652-4d33-8f78-83ec11da0556\") "
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.488031 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-utilities" (OuterVolumeSpecName: "utilities") pod "6fe13277-9652-4d33-8f78-83ec11da0556" (UID: "6fe13277-9652-4d33-8f78-83ec11da0556"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.495253 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe13277-9652-4d33-8f78-83ec11da0556-kube-api-access-ngfrg" (OuterVolumeSpecName: "kube-api-access-ngfrg") pod "6fe13277-9652-4d33-8f78-83ec11da0556" (UID: "6fe13277-9652-4d33-8f78-83ec11da0556"). InnerVolumeSpecName "kube-api-access-ngfrg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.586318 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.586360 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngfrg\" (UniqueName: \"kubernetes.io/projected/6fe13277-9652-4d33-8f78-83ec11da0556-kube-api-access-ngfrg\") on node \"crc\" DevicePath \"\""
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.637412 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fe13277-9652-4d33-8f78-83ec11da0556" (UID: "6fe13277-9652-4d33-8f78-83ec11da0556"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.687340 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe13277-9652-4d33-8f78-83ec11da0556-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.911653 5047 generic.go:334] "Generic (PLEG): container finished" podID="6fe13277-9652-4d33-8f78-83ec11da0556" containerID="37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d" exitCode=0
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.911739 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nmdx8"
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.911734 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmdx8" event={"ID":"6fe13277-9652-4d33-8f78-83ec11da0556","Type":"ContainerDied","Data":"37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d"}
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.915008 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmdx8" event={"ID":"6fe13277-9652-4d33-8f78-83ec11da0556","Type":"ContainerDied","Data":"1ef9c21f3823f1d16caecf07fd1b755add70eeffe3f820da45df35e02a5c9da3"}
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.915092 5047 scope.go:117] "RemoveContainer" containerID="37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d"
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.946369 5047 scope.go:117] "RemoveContainer" containerID="c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0"
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.980301 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmdx8"]
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.991083 5047 scope.go:117] "RemoveContainer" containerID="a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67"
Feb 23 08:08:51 crc kubenswrapper[5047]: I0223 08:08:51.995352 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nmdx8"]
Feb 23 08:08:52 crc kubenswrapper[5047]: I0223 08:08:52.008522 5047 scope.go:117] "RemoveContainer" containerID="37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d"
Feb 23 08:08:52 crc kubenswrapper[5047]: E0223 08:08:52.009002 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d\": container with ID starting with 37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d not found: ID does not exist" containerID="37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d"
Feb 23 08:08:52 crc kubenswrapper[5047]: I0223 08:08:52.009078 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d"} err="failed to get container status \"37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d\": rpc error: code = NotFound desc = could not find container \"37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d\": container with ID starting with 37392ba55726c91b957e4d2a5e0be4816d1d259c68d74ba4114a2295b905c73d not found: ID does not exist"
Feb 23 08:08:52 crc kubenswrapper[5047]: I0223 08:08:52.009110 5047 scope.go:117] "RemoveContainer" containerID="c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0"
Feb 23 08:08:52 crc kubenswrapper[5047]: E0223 08:08:52.009494 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0\": container with ID starting with c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0 not found: ID does not exist" containerID="c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0"
Feb 23 08:08:52 crc kubenswrapper[5047]: I0223 08:08:52.009529 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0"} err="failed to get container status \"c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0\": rpc error: code = NotFound desc = could not find container \"c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0\": container with ID starting with c2c7b26b6c36589d3e62965e154f366c1a162e42eaf4917ce107230babe014c0 not found: ID does not exist"
Feb 23 08:08:52 crc kubenswrapper[5047]: I0223 08:08:52.009544 5047 scope.go:117] "RemoveContainer" containerID="a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67"
Feb 23 08:08:52 crc kubenswrapper[5047]: E0223 08:08:52.010097 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67\": container with ID starting with a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67 not found: ID does not exist" containerID="a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67"
Feb 23 08:08:52 crc kubenswrapper[5047]: I0223 08:08:52.010122 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67"} err="failed to get container status \"a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67\": rpc error: code = NotFound desc = could not find container \"a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67\": container with ID starting with a05a62ea1eb8479ceb72dfad5f3db82c80fa7095f5650eb5b76f2fd1dae5fc67 not found: ID does not exist"
Feb 23 08:08:52 crc kubenswrapper[5047]: I0223 08:08:52.350970 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" path="/var/lib/kubelet/pods/6fe13277-9652-4d33-8f78-83ec11da0556/volumes"
Feb 23 08:09:16 crc kubenswrapper[5047]: I0223 08:09:16.760022 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:09:16 crc kubenswrapper[5047]: I0223 08:09:16.760859 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:09:46 crc kubenswrapper[5047]: I0223 08:09:46.759379 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:09:46 crc kubenswrapper[5047]: I0223 08:09:46.760324 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:10:16 crc kubenswrapper[5047]: I0223 08:10:16.759343 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:10:16 crc kubenswrapper[5047]: I0223 08:10:16.760362 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:10:16 crc kubenswrapper[5047]: I0223 08:10:16.760434 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv"
Feb 23 08:10:16 crc kubenswrapper[5047]: I0223 08:10:16.761269 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"332cc8df4762a91bdb66c6860a540e49fed45f15e7251e3a2ad529fe539ef41a"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:10:16 crc kubenswrapper[5047]: I0223 08:10:16.761373 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://332cc8df4762a91bdb66c6860a540e49fed45f15e7251e3a2ad529fe539ef41a" gracePeriod=600
Feb 23 08:10:17 crc kubenswrapper[5047]: I0223 08:10:17.790114 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="332cc8df4762a91bdb66c6860a540e49fed45f15e7251e3a2ad529fe539ef41a" exitCode=0
Feb 23 08:10:17 crc kubenswrapper[5047]: I0223 08:10:17.790143 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"332cc8df4762a91bdb66c6860a540e49fed45f15e7251e3a2ad529fe539ef41a"}
Feb 23 08:10:17 crc kubenswrapper[5047]: I0223 08:10:17.790838 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"}
Feb 23 08:10:17 crc kubenswrapper[5047]: I0223 08:10:17.790975 5047 scope.go:117] "RemoveContainer" containerID="efeb77298b51518310a0b1a3b79963bed9617a8e3ec26c6c9dd7e709b08d4062"
Feb 23 08:12:46 crc kubenswrapper[5047]: I0223 08:12:46.760190 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:12:46 crc kubenswrapper[5047]: I0223 08:12:46.761003 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:13:16 crc kubenswrapper[5047]: I0223 08:13:16.760028 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:13:16 crc kubenswrapper[5047]: I0223 08:13:16.760798 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:13:46 crc kubenswrapper[5047]: I0223 08:13:46.760340 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:13:46 crc kubenswrapper[5047]: I0223 08:13:46.761362 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:13:46 crc kubenswrapper[5047]: I0223 08:13:46.761445 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv"
Feb 23 08:13:46 crc kubenswrapper[5047]: I0223 08:13:46.762762 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:13:46 crc kubenswrapper[5047]: I0223 08:13:46.762935 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" gracePeriod=600
Feb 23 08:13:46 crc kubenswrapper[5047]: E0223 08:13:46.898942 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:13:47 crc kubenswrapper[5047]: I0223 08:13:47.854189 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" exitCode=0
Feb 23 08:13:47 crc kubenswrapper[5047]: I0223 08:13:47.854272 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"}
Feb 23 08:13:47 crc kubenswrapper[5047]: I0223 08:13:47.854340 5047 scope.go:117] "RemoveContainer" containerID="332cc8df4762a91bdb66c6860a540e49fed45f15e7251e3a2ad529fe539ef41a"
Feb 23 08:13:47 crc kubenswrapper[5047]: I0223 08:13:47.855348 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"
Feb 23 08:13:47 crc kubenswrapper[5047]: E0223 08:13:47.855816 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:14:00 crc kubenswrapper[5047]: I0223 08:14:00.340481 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"
Feb 23 08:14:00 crc kubenswrapper[5047]: E0223 08:14:00.341653 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:14:13 crc kubenswrapper[5047]: I0223 08:14:13.340634 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"
Feb 23 08:14:13 crc kubenswrapper[5047]: E0223 08:14:13.341684 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.348895 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"
Feb 23 08:14:28 crc kubenswrapper[5047]: E0223 08:14:28.350247 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.998134 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9zsss"]
Feb 23 08:14:28 crc kubenswrapper[5047]: E0223 08:14:28.998725 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="extract-utilities"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.998765 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="extract-utilities"
Feb 23 08:14:28 crc kubenswrapper[5047]: E0223 08:14:28.998797 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="extract-utilities"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.998815 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="extract-utilities"
Feb 23 08:14:28 crc kubenswrapper[5047]: E0223 08:14:28.998866 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="extract-content"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.998883 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="extract-content"
Feb 23 08:14:28 crc kubenswrapper[5047]: E0223 08:14:28.998987 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="registry-server"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.999004 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="registry-server"
Feb 23 08:14:28 crc kubenswrapper[5047]: E0223 08:14:28.999029 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="registry-server"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.999045 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="registry-server"
Feb 23 08:14:28 crc kubenswrapper[5047]: E0223 08:14:28.999077 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="extract-content"
Feb 23 08:14:28 crc kubenswrapper[5047]: I0223 08:14:28.999093 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="extract-content"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:28.999413 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe13277-9652-4d33-8f78-83ec11da0556" containerName="registry-server"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:28.999471 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2455b784-ab99-4cda-8194-5e4fc8fc6550" containerName="registry-server"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.001822 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.033434 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zsss"]
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.179147 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-catalog-content\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.179331 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24h6g\" (UniqueName: \"kubernetes.io/projected/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-kube-api-access-24h6g\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.179386 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-utilities\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.280988 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24h6g\" (UniqueName: \"kubernetes.io/projected/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-kube-api-access-24h6g\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.281045 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-utilities\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.281080 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-catalog-content\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.281722 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-utilities\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.281778 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-catalog-content\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.307262 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24h6g\" (UniqueName: \"kubernetes.io/projected/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-kube-api-access-24h6g\") pod \"community-operators-9zsss\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") " pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.330839 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:29 crc kubenswrapper[5047]: I0223 08:14:29.846252 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zsss"]
Feb 23 08:14:30 crc kubenswrapper[5047]: I0223 08:14:30.304464 5047 generic.go:334] "Generic (PLEG): container finished" podID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerID="4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b" exitCode=0
Feb 23 08:14:30 crc kubenswrapper[5047]: I0223 08:14:30.304551 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zsss" event={"ID":"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75","Type":"ContainerDied","Data":"4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b"}
Feb 23 08:14:30 crc kubenswrapper[5047]: I0223 08:14:30.304627 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zsss" event={"ID":"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75","Type":"ContainerStarted","Data":"16880f206e69152eeac368069930fe4792d124646e61e3dbcc58b15ab93a95e6"}
Feb 23 08:14:30 crc kubenswrapper[5047]: I0223 08:14:30.307682 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:14:31 crc kubenswrapper[5047]: I0223 08:14:31.318088 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zsss" event={"ID":"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75","Type":"ContainerStarted","Data":"43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734"}
Feb 23 08:14:32 crc kubenswrapper[5047]: I0223 08:14:32.327980 5047 generic.go:334] "Generic (PLEG): container finished" podID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerID="43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734" exitCode=0
Feb 23 08:14:32 crc kubenswrapper[5047]: I0223 08:14:32.328048 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zsss" event={"ID":"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75","Type":"ContainerDied","Data":"43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734"}
Feb 23 08:14:33 crc kubenswrapper[5047]: I0223 08:14:33.339754 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zsss" event={"ID":"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75","Type":"ContainerStarted","Data":"742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f"}
Feb 23 08:14:33 crc kubenswrapper[5047]: I0223 08:14:33.382163 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9zsss" podStartSLOduration=2.948642638 podStartE2EDuration="5.382135893s" podCreationTimestamp="2026-02-23 08:14:28 +0000 UTC" firstStartedPulling="2026-02-23 08:14:30.307018282 +0000 UTC m=+5392.558345416" lastFinishedPulling="2026-02-23 08:14:32.740511537 +0000 UTC m=+5394.991838671" observedRunningTime="2026-02-23 08:14:33.370454819 +0000 UTC m=+5395.621781973" watchObservedRunningTime="2026-02-23 08:14:33.382135893 +0000 UTC m=+5395.633463067"
Feb 23 08:14:39 crc kubenswrapper[5047]: I0223 08:14:39.331472 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:39 crc kubenswrapper[5047]: I0223 08:14:39.332388 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:39 crc kubenswrapper[5047]: I0223 08:14:39.414810 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:40 crc kubenswrapper[5047]: I0223 08:14:40.341574 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59"
Feb 23 08:14:40 crc kubenswrapper[5047]: E0223 08:14:40.342005 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:14:40 crc kubenswrapper[5047]: I0223 08:14:40.482541 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:40 crc kubenswrapper[5047]: I0223 08:14:40.561344 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zsss"]
Feb 23 08:14:42 crc kubenswrapper[5047]: I0223 08:14:42.433221 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9zsss" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="registry-server" containerID="cri-o://742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f" gracePeriod=2
Feb 23 08:14:42 crc kubenswrapper[5047]: I0223 08:14:42.934757 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.135869 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24h6g\" (UniqueName: \"kubernetes.io/projected/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-kube-api-access-24h6g\") pod \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") "
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.136096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-catalog-content\") pod \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") "
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.136252 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-utilities\") pod \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\" (UID: \"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75\") "
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.137579 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-utilities" (OuterVolumeSpecName: "utilities") pod "4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" (UID: "4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.144550 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-kube-api-access-24h6g" (OuterVolumeSpecName: "kube-api-access-24h6g") pod "4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" (UID: "4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75"). InnerVolumeSpecName "kube-api-access-24h6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.223688 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" (UID: "4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.239118 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24h6g\" (UniqueName: \"kubernetes.io/projected/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-kube-api-access-24h6g\") on node \"crc\" DevicePath \"\""
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.239175 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.239193 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.450104 5047 generic.go:334] "Generic (PLEG): container finished" podID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerID="742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f" exitCode=0
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.450197 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zsss" event={"ID":"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75","Type":"ContainerDied","Data":"742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f"}
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.450208 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zsss"
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.450266 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zsss" event={"ID":"4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75","Type":"ContainerDied","Data":"16880f206e69152eeac368069930fe4792d124646e61e3dbcc58b15ab93a95e6"}
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.450323 5047 scope.go:117] "RemoveContainer" containerID="742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f"
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.483725 5047 scope.go:117] "RemoveContainer" containerID="43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734"
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.514160 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zsss"]
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.517866 5047 scope.go:117] "RemoveContainer" containerID="4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b"
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.530899 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9zsss"]
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.566334 5047 scope.go:117] "RemoveContainer" containerID="742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f"
Feb 23 08:14:43 crc kubenswrapper[5047]: E0223 08:14:43.568393 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f\": container with ID starting with 742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f not found: ID does not exist" containerID="742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f"
Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.568512
5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f"} err="failed to get container status \"742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f\": rpc error: code = NotFound desc = could not find container \"742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f\": container with ID starting with 742459d5706742087f88bf4e1bd528cf58068fe93794d6877cf7905069cfae9f not found: ID does not exist" Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.568578 5047 scope.go:117] "RemoveContainer" containerID="43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734" Feb 23 08:14:43 crc kubenswrapper[5047]: E0223 08:14:43.569207 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734\": container with ID starting with 43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734 not found: ID does not exist" containerID="43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734" Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.569235 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734"} err="failed to get container status \"43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734\": rpc error: code = NotFound desc = could not find container \"43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734\": container with ID starting with 43f869f29acc1bae657f6aec0c97357fab1190251e81ddbbd7306afc05000734 not found: ID does not exist" Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.569252 5047 scope.go:117] "RemoveContainer" containerID="4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b" Feb 23 08:14:43 crc kubenswrapper[5047]: E0223 
08:14:43.570062 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b\": container with ID starting with 4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b not found: ID does not exist" containerID="4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b" Feb 23 08:14:43 crc kubenswrapper[5047]: I0223 08:14:43.570181 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b"} err="failed to get container status \"4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b\": rpc error: code = NotFound desc = could not find container \"4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b\": container with ID starting with 4fb4b51be988bcb9680cccf9b07ace911107db53a2d78f62f6ee1f935c7eff8b not found: ID does not exist" Feb 23 08:14:44 crc kubenswrapper[5047]: I0223 08:14:44.356679 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" path="/var/lib/kubelet/pods/4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75/volumes" Feb 23 08:14:52 crc kubenswrapper[5047]: I0223 08:14:52.342321 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:14:52 crc kubenswrapper[5047]: E0223 08:14:52.343567 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.170417 
5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn"] Feb 23 08:15:00 crc kubenswrapper[5047]: E0223 08:15:00.171347 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="extract-content" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.171365 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="extract-content" Feb 23 08:15:00 crc kubenswrapper[5047]: E0223 08:15:00.171383 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.171388 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5047]: E0223 08:15:00.171422 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.171428 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.171571 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffb8fa4-46b7-40b8-8d22-50a6a42c5a75" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.172152 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.181111 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.181584 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.190063 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn"] Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.344638 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0d81715-a150-4c19-8c5f-ac88e34317c2-secret-volume\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.344693 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s9mg\" (UniqueName: \"kubernetes.io/projected/b0d81715-a150-4c19-8c5f-ac88e34317c2-kube-api-access-9s9mg\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.344723 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0d81715-a150-4c19-8c5f-ac88e34317c2-config-volume\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.446998 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s9mg\" (UniqueName: \"kubernetes.io/projected/b0d81715-a150-4c19-8c5f-ac88e34317c2-kube-api-access-9s9mg\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.447063 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0d81715-a150-4c19-8c5f-ac88e34317c2-config-volume\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.447176 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0d81715-a150-4c19-8c5f-ac88e34317c2-secret-volume\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.448998 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0d81715-a150-4c19-8c5f-ac88e34317c2-config-volume\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.454482 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b0d81715-a150-4c19-8c5f-ac88e34317c2-secret-volume\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.476734 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s9mg\" (UniqueName: \"kubernetes.io/projected/b0d81715-a150-4c19-8c5f-ac88e34317c2-kube-api-access-9s9mg\") pod \"collect-profiles-29530575-khwmn\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:00 crc kubenswrapper[5047]: I0223 08:15:00.494945 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:01 crc kubenswrapper[5047]: I0223 08:15:01.022968 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn"] Feb 23 08:15:01 crc kubenswrapper[5047]: I0223 08:15:01.656747 5047 generic.go:334] "Generic (PLEG): container finished" podID="b0d81715-a150-4c19-8c5f-ac88e34317c2" containerID="c2708b681368b38603116c4a55fd3d4ee45d4338cf1ec012ce29a141b51ed45d" exitCode=0 Feb 23 08:15:01 crc kubenswrapper[5047]: I0223 08:15:01.656821 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" event={"ID":"b0d81715-a150-4c19-8c5f-ac88e34317c2","Type":"ContainerDied","Data":"c2708b681368b38603116c4a55fd3d4ee45d4338cf1ec012ce29a141b51ed45d"} Feb 23 08:15:01 crc kubenswrapper[5047]: I0223 08:15:01.656872 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" 
event={"ID":"b0d81715-a150-4c19-8c5f-ac88e34317c2","Type":"ContainerStarted","Data":"6a279f80192342dc7f16828a43ca11b4861d1a15fce5a6fd4de2ebd61647393d"} Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.033463 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.198037 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0d81715-a150-4c19-8c5f-ac88e34317c2-config-volume\") pod \"b0d81715-a150-4c19-8c5f-ac88e34317c2\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.198681 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0d81715-a150-4c19-8c5f-ac88e34317c2-secret-volume\") pod \"b0d81715-a150-4c19-8c5f-ac88e34317c2\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.200162 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s9mg\" (UniqueName: \"kubernetes.io/projected/b0d81715-a150-4c19-8c5f-ac88e34317c2-kube-api-access-9s9mg\") pod \"b0d81715-a150-4c19-8c5f-ac88e34317c2\" (UID: \"b0d81715-a150-4c19-8c5f-ac88e34317c2\") " Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.200408 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d81715-a150-4c19-8c5f-ac88e34317c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "b0d81715-a150-4c19-8c5f-ac88e34317c2" (UID: "b0d81715-a150-4c19-8c5f-ac88e34317c2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.200852 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0d81715-a150-4c19-8c5f-ac88e34317c2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.207262 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d81715-a150-4c19-8c5f-ac88e34317c2-kube-api-access-9s9mg" (OuterVolumeSpecName: "kube-api-access-9s9mg") pod "b0d81715-a150-4c19-8c5f-ac88e34317c2" (UID: "b0d81715-a150-4c19-8c5f-ac88e34317c2"). InnerVolumeSpecName "kube-api-access-9s9mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.209092 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d81715-a150-4c19-8c5f-ac88e34317c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b0d81715-a150-4c19-8c5f-ac88e34317c2" (UID: "b0d81715-a150-4c19-8c5f-ac88e34317c2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.303144 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0d81715-a150-4c19-8c5f-ac88e34317c2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.303204 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s9mg\" (UniqueName: \"kubernetes.io/projected/b0d81715-a150-4c19-8c5f-ac88e34317c2-kube-api-access-9s9mg\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.341113 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:15:03 crc kubenswrapper[5047]: E0223 08:15:03.341454 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.675171 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" event={"ID":"b0d81715-a150-4c19-8c5f-ac88e34317c2","Type":"ContainerDied","Data":"6a279f80192342dc7f16828a43ca11b4861d1a15fce5a6fd4de2ebd61647393d"} Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.675245 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a279f80192342dc7f16828a43ca11b4861d1a15fce5a6fd4de2ebd61647393d" Feb 23 08:15:03 crc kubenswrapper[5047]: I0223 08:15:03.675338 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn" Feb 23 08:15:04 crc kubenswrapper[5047]: I0223 08:15:04.135991 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc"] Feb 23 08:15:04 crc kubenswrapper[5047]: I0223 08:15:04.136963 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-m9fjc"] Feb 23 08:15:04 crc kubenswrapper[5047]: I0223 08:15:04.354707 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843b8015-e7e8-41a1-b138-af5e0c84a030" path="/var/lib/kubelet/pods/843b8015-e7e8-41a1-b138-af5e0c84a030/volumes" Feb 23 08:15:15 crc kubenswrapper[5047]: I0223 08:15:15.342269 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:15:15 crc kubenswrapper[5047]: E0223 08:15:15.344822 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:15:29 crc kubenswrapper[5047]: I0223 08:15:29.341328 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:15:29 crc kubenswrapper[5047]: E0223 08:15:29.341993 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:15:40 crc kubenswrapper[5047]: I0223 08:15:40.342073 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:15:40 crc kubenswrapper[5047]: E0223 08:15:40.343318 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:15:52 crc kubenswrapper[5047]: I0223 08:15:52.341409 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:15:52 crc kubenswrapper[5047]: E0223 08:15:52.342586 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:15:54 crc kubenswrapper[5047]: I0223 08:15:54.882188 5047 scope.go:117] "RemoveContainer" containerID="5e3175a92fc9af5cb28f81925aae27a76c1169a89bc1df5121ffbda49e05a7e6" Feb 23 08:16:06 crc kubenswrapper[5047]: I0223 08:16:06.341232 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:16:06 crc kubenswrapper[5047]: E0223 08:16:06.342517 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:16:21 crc kubenswrapper[5047]: I0223 08:16:21.341361 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:16:21 crc kubenswrapper[5047]: E0223 08:16:21.342818 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:16:36 crc kubenswrapper[5047]: I0223 08:16:36.341876 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:16:36 crc kubenswrapper[5047]: E0223 08:16:36.344249 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.616852 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l9mnk"] Feb 23 08:16:46 crc kubenswrapper[5047]: E0223 08:16:46.618416 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d81715-a150-4c19-8c5f-ac88e34317c2" 
containerName="collect-profiles" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.618449 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d81715-a150-4c19-8c5f-ac88e34317c2" containerName="collect-profiles" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.618757 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d81715-a150-4c19-8c5f-ac88e34317c2" containerName="collect-profiles" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.620769 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.634347 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9mnk"] Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.680397 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brh68\" (UniqueName: \"kubernetes.io/projected/5a3ea198-8511-4e36-81de-09d86e019dce-kube-api-access-brh68\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.680863 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-catalog-content\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.680993 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-utilities\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " 
pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.783008 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-utilities\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.783119 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brh68\" (UniqueName: \"kubernetes.io/projected/5a3ea198-8511-4e36-81de-09d86e019dce-kube-api-access-brh68\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.783205 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-catalog-content\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.783899 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-catalog-content\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.784502 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-utilities\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" 
Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.811612 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brh68\" (UniqueName: \"kubernetes.io/projected/5a3ea198-8511-4e36-81de-09d86e019dce-kube-api-access-brh68\") pod \"redhat-marketplace-l9mnk\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:46 crc kubenswrapper[5047]: I0223 08:16:46.949725 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:47 crc kubenswrapper[5047]: I0223 08:16:47.183005 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9mnk"] Feb 23 08:16:47 crc kubenswrapper[5047]: I0223 08:16:47.341225 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:16:47 crc kubenswrapper[5047]: E0223 08:16:47.341505 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:16:47 crc kubenswrapper[5047]: I0223 08:16:47.709477 5047 generic.go:334] "Generic (PLEG): container finished" podID="5a3ea198-8511-4e36-81de-09d86e019dce" containerID="0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a" exitCode=0 Feb 23 08:16:47 crc kubenswrapper[5047]: I0223 08:16:47.709542 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9mnk" 
event={"ID":"5a3ea198-8511-4e36-81de-09d86e019dce","Type":"ContainerDied","Data":"0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a"} Feb 23 08:16:47 crc kubenswrapper[5047]: I0223 08:16:47.709580 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9mnk" event={"ID":"5a3ea198-8511-4e36-81de-09d86e019dce","Type":"ContainerStarted","Data":"799b4b7165762b1feb5377c321c16d4c923e933f2805aa88f70405a899db23ef"} Feb 23 08:16:48 crc kubenswrapper[5047]: I0223 08:16:48.719102 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9mnk" event={"ID":"5a3ea198-8511-4e36-81de-09d86e019dce","Type":"ContainerStarted","Data":"5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c"} Feb 23 08:16:49 crc kubenswrapper[5047]: I0223 08:16:49.731293 5047 generic.go:334] "Generic (PLEG): container finished" podID="5a3ea198-8511-4e36-81de-09d86e019dce" containerID="5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c" exitCode=0 Feb 23 08:16:49 crc kubenswrapper[5047]: I0223 08:16:49.731422 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9mnk" event={"ID":"5a3ea198-8511-4e36-81de-09d86e019dce","Type":"ContainerDied","Data":"5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c"} Feb 23 08:16:50 crc kubenswrapper[5047]: I0223 08:16:50.745893 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9mnk" event={"ID":"5a3ea198-8511-4e36-81de-09d86e019dce","Type":"ContainerStarted","Data":"aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628"} Feb 23 08:16:50 crc kubenswrapper[5047]: I0223 08:16:50.787773 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l9mnk" podStartSLOduration=2.387914824 podStartE2EDuration="4.787747882s" podCreationTimestamp="2026-02-23 08:16:46 +0000 
UTC" firstStartedPulling="2026-02-23 08:16:47.712644822 +0000 UTC m=+5529.963971956" lastFinishedPulling="2026-02-23 08:16:50.11247787 +0000 UTC m=+5532.363805014" observedRunningTime="2026-02-23 08:16:50.784489464 +0000 UTC m=+5533.035816608" watchObservedRunningTime="2026-02-23 08:16:50.787747882 +0000 UTC m=+5533.039075056" Feb 23 08:16:56 crc kubenswrapper[5047]: I0223 08:16:56.950447 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:56 crc kubenswrapper[5047]: I0223 08:16:56.950827 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:57 crc kubenswrapper[5047]: I0223 08:16:57.037672 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:57 crc kubenswrapper[5047]: I0223 08:16:57.875302 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:16:57 crc kubenswrapper[5047]: I0223 08:16:57.931909 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9mnk"] Feb 23 08:16:58 crc kubenswrapper[5047]: I0223 08:16:58.346023 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:16:58 crc kubenswrapper[5047]: E0223 08:16:58.346587 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:16:59 crc kubenswrapper[5047]: I0223 08:16:59.834075 5047 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l9mnk" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="registry-server" containerID="cri-o://aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628" gracePeriod=2 Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.354468 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.428342 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-catalog-content\") pod \"5a3ea198-8511-4e36-81de-09d86e019dce\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.428533 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brh68\" (UniqueName: \"kubernetes.io/projected/5a3ea198-8511-4e36-81de-09d86e019dce-kube-api-access-brh68\") pod \"5a3ea198-8511-4e36-81de-09d86e019dce\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.428646 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-utilities\") pod \"5a3ea198-8511-4e36-81de-09d86e019dce\" (UID: \"5a3ea198-8511-4e36-81de-09d86e019dce\") " Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.429854 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-utilities" (OuterVolumeSpecName: "utilities") pod "5a3ea198-8511-4e36-81de-09d86e019dce" (UID: "5a3ea198-8511-4e36-81de-09d86e019dce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.437514 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3ea198-8511-4e36-81de-09d86e019dce-kube-api-access-brh68" (OuterVolumeSpecName: "kube-api-access-brh68") pod "5a3ea198-8511-4e36-81de-09d86e019dce" (UID: "5a3ea198-8511-4e36-81de-09d86e019dce"). InnerVolumeSpecName "kube-api-access-brh68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.454361 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a3ea198-8511-4e36-81de-09d86e019dce" (UID: "5a3ea198-8511-4e36-81de-09d86e019dce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.531304 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.531385 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a3ea198-8511-4e36-81de-09d86e019dce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.531415 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brh68\" (UniqueName: \"kubernetes.io/projected/5a3ea198-8511-4e36-81de-09d86e019dce-kube-api-access-brh68\") on node \"crc\" DevicePath \"\"" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.845467 5047 generic.go:334] "Generic (PLEG): container finished" podID="5a3ea198-8511-4e36-81de-09d86e019dce" 
containerID="aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628" exitCode=0 Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.845548 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9mnk" event={"ID":"5a3ea198-8511-4e36-81de-09d86e019dce","Type":"ContainerDied","Data":"aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628"} Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.845557 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9mnk" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.845607 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9mnk" event={"ID":"5a3ea198-8511-4e36-81de-09d86e019dce","Type":"ContainerDied","Data":"799b4b7165762b1feb5377c321c16d4c923e933f2805aa88f70405a899db23ef"} Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.845642 5047 scope.go:117] "RemoveContainer" containerID="aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.882612 5047 scope.go:117] "RemoveContainer" containerID="5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.888267 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9mnk"] Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.893980 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9mnk"] Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.924054 5047 scope.go:117] "RemoveContainer" containerID="0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.954879 5047 scope.go:117] "RemoveContainer" containerID="aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628" Feb 23 
08:17:00 crc kubenswrapper[5047]: E0223 08:17:00.955621 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628\": container with ID starting with aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628 not found: ID does not exist" containerID="aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.955821 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628"} err="failed to get container status \"aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628\": rpc error: code = NotFound desc = could not find container \"aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628\": container with ID starting with aa55b1ccfd997116c9c11fd5b67a75daf642cc1fdd191b43c283667ba8bcf628 not found: ID does not exist" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.955872 5047 scope.go:117] "RemoveContainer" containerID="5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c" Feb 23 08:17:00 crc kubenswrapper[5047]: E0223 08:17:00.956311 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c\": container with ID starting with 5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c not found: ID does not exist" containerID="5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.956372 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c"} err="failed to get container status 
\"5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c\": rpc error: code = NotFound desc = could not find container \"5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c\": container with ID starting with 5fbde1dd180b66ceff2304cc070df35cd82d2e281c95f72b3b093fb4bb8bd23c not found: ID does not exist" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.956415 5047 scope.go:117] "RemoveContainer" containerID="0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a" Feb 23 08:17:00 crc kubenswrapper[5047]: E0223 08:17:00.957107 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a\": container with ID starting with 0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a not found: ID does not exist" containerID="0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a" Feb 23 08:17:00 crc kubenswrapper[5047]: I0223 08:17:00.957181 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a"} err="failed to get container status \"0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a\": rpc error: code = NotFound desc = could not find container \"0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a\": container with ID starting with 0017e9e45a53d8a2fe1278cd6b090a681204d583d5b0c7e2772dba1275e4090a not found: ID does not exist" Feb 23 08:17:02 crc kubenswrapper[5047]: I0223 08:17:02.353120 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" path="/var/lib/kubelet/pods/5a3ea198-8511-4e36-81de-09d86e019dce/volumes" Feb 23 08:17:12 crc kubenswrapper[5047]: I0223 08:17:12.342143 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 
08:17:12 crc kubenswrapper[5047]: E0223 08:17:12.343524 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:17:26 crc kubenswrapper[5047]: I0223 08:17:26.340540 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:17:26 crc kubenswrapper[5047]: E0223 08:17:26.341111 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:17:38 crc kubenswrapper[5047]: I0223 08:17:38.378101 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:17:38 crc kubenswrapper[5047]: E0223 08:17:38.379678 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:17:52 crc kubenswrapper[5047]: I0223 08:17:52.342047 5047 scope.go:117] "RemoveContainer" 
containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:17:52 crc kubenswrapper[5047]: E0223 08:17:52.342881 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:18:07 crc kubenswrapper[5047]: I0223 08:18:07.341057 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:18:07 crc kubenswrapper[5047]: E0223 08:18:07.342165 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:18:20 crc kubenswrapper[5047]: I0223 08:18:20.341495 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:18:20 crc kubenswrapper[5047]: E0223 08:18:20.342662 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:18:35 crc kubenswrapper[5047]: I0223 08:18:35.341680 5047 scope.go:117] 
"RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:18:35 crc kubenswrapper[5047]: E0223 08:18:35.342903 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:18:49 crc kubenswrapper[5047]: I0223 08:18:49.342270 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:18:49 crc kubenswrapper[5047]: I0223 08:18:49.942028 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"7adba1578f3b89f77a4d3dbd29dbc310a04b48452a2a36e2244bdd32ebf8310f"} Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.381298 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kcx28"] Feb 23 08:19:44 crc kubenswrapper[5047]: E0223 08:19:44.382642 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="extract-content" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.382669 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="extract-content" Feb 23 08:19:44 crc kubenswrapper[5047]: E0223 08:19:44.382693 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="registry-server" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.382709 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="registry-server" Feb 23 08:19:44 crc kubenswrapper[5047]: E0223 08:19:44.382744 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="extract-utilities" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.382760 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="extract-utilities" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.383053 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3ea198-8511-4e36-81de-09d86e019dce" containerName="registry-server" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.385442 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.398647 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcx28"] Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.566614 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-catalog-content\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.566822 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-utilities\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.566897 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-kube-api-access-q69nb\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.668560 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-catalog-content\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.668712 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-utilities\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.668763 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-kube-api-access-q69nb\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.669335 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-utilities\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.669335 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-catalog-content\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.705796 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-kube-api-access-q69nb\") pod \"redhat-operators-kcx28\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:44 crc kubenswrapper[5047]: I0223 08:19:44.718046 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:45 crc kubenswrapper[5047]: I0223 08:19:45.052862 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcx28"] Feb 23 08:19:45 crc kubenswrapper[5047]: I0223 08:19:45.534885 5047 generic.go:334] "Generic (PLEG): container finished" podID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerID="3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb" exitCode=0 Feb 23 08:19:45 crc kubenswrapper[5047]: I0223 08:19:45.535013 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcx28" event={"ID":"2b948d47-0b1c-404a-b39d-db5fddc4d1f9","Type":"ContainerDied","Data":"3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb"} Feb 23 08:19:45 crc kubenswrapper[5047]: I0223 08:19:45.535384 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcx28" event={"ID":"2b948d47-0b1c-404a-b39d-db5fddc4d1f9","Type":"ContainerStarted","Data":"3b2350a0aef73257ed037dfa5962524e491c09032a1544e3231fb272b4112c0d"} Feb 23 08:19:45 crc kubenswrapper[5047]: I0223 08:19:45.537245 5047 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 23 08:19:46 crc kubenswrapper[5047]: I0223 08:19:46.559255 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcx28" event={"ID":"2b948d47-0b1c-404a-b39d-db5fddc4d1f9","Type":"ContainerStarted","Data":"dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247"} Feb 23 08:19:47 crc kubenswrapper[5047]: I0223 08:19:47.574572 5047 generic.go:334] "Generic (PLEG): container finished" podID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerID="dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247" exitCode=0 Feb 23 08:19:47 crc kubenswrapper[5047]: I0223 08:19:47.574714 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcx28" event={"ID":"2b948d47-0b1c-404a-b39d-db5fddc4d1f9","Type":"ContainerDied","Data":"dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247"} Feb 23 08:19:48 crc kubenswrapper[5047]: I0223 08:19:48.586157 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcx28" event={"ID":"2b948d47-0b1c-404a-b39d-db5fddc4d1f9","Type":"ContainerStarted","Data":"e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db"} Feb 23 08:19:48 crc kubenswrapper[5047]: I0223 08:19:48.621643 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kcx28" podStartSLOduration=2.125247049 podStartE2EDuration="4.621617686s" podCreationTimestamp="2026-02-23 08:19:44 +0000 UTC" firstStartedPulling="2026-02-23 08:19:45.536997109 +0000 UTC m=+5707.788324243" lastFinishedPulling="2026-02-23 08:19:48.033367716 +0000 UTC m=+5710.284694880" observedRunningTime="2026-02-23 08:19:48.616059726 +0000 UTC m=+5710.867386870" watchObservedRunningTime="2026-02-23 08:19:48.621617686 +0000 UTC m=+5710.872944830" Feb 23 08:19:54 crc kubenswrapper[5047]: I0223 08:19:54.719436 5047 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:54 crc kubenswrapper[5047]: I0223 08:19:54.720224 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:19:55 crc kubenswrapper[5047]: I0223 08:19:55.782656 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kcx28" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="registry-server" probeResult="failure" output=< Feb 23 08:19:55 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:19:55 crc kubenswrapper[5047]: > Feb 23 08:20:04 crc kubenswrapper[5047]: I0223 08:20:04.793054 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:20:04 crc kubenswrapper[5047]: I0223 08:20:04.862239 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:20:05 crc kubenswrapper[5047]: I0223 08:20:05.049727 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcx28"] Feb 23 08:20:06 crc kubenswrapper[5047]: I0223 08:20:06.753326 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kcx28" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="registry-server" containerID="cri-o://e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db" gracePeriod=2 Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.216883 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.368413 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-catalog-content\") pod \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.369475 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-kube-api-access-q69nb\") pod \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.370154 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-utilities\") pod \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\" (UID: \"2b948d47-0b1c-404a-b39d-db5fddc4d1f9\") " Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.371887 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-utilities" (OuterVolumeSpecName: "utilities") pod "2b948d47-0b1c-404a-b39d-db5fddc4d1f9" (UID: "2b948d47-0b1c-404a-b39d-db5fddc4d1f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.379536 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-kube-api-access-q69nb" (OuterVolumeSpecName: "kube-api-access-q69nb") pod "2b948d47-0b1c-404a-b39d-db5fddc4d1f9" (UID: "2b948d47-0b1c-404a-b39d-db5fddc4d1f9"). InnerVolumeSpecName "kube-api-access-q69nb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.473393 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-kube-api-access-q69nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.473891 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.546367 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b948d47-0b1c-404a-b39d-db5fddc4d1f9" (UID: "2b948d47-0b1c-404a-b39d-db5fddc4d1f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.575803 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b948d47-0b1c-404a-b39d-db5fddc4d1f9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.768255 5047 generic.go:334] "Generic (PLEG): container finished" podID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerID="e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db" exitCode=0 Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.768405 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcx28" event={"ID":"2b948d47-0b1c-404a-b39d-db5fddc4d1f9","Type":"ContainerDied","Data":"e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db"} Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.768445 5047 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcx28" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.768526 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcx28" event={"ID":"2b948d47-0b1c-404a-b39d-db5fddc4d1f9","Type":"ContainerDied","Data":"3b2350a0aef73257ed037dfa5962524e491c09032a1544e3231fb272b4112c0d"} Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.768563 5047 scope.go:117] "RemoveContainer" containerID="e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.830550 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcx28"] Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.837812 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kcx28"] Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.839166 5047 scope.go:117] "RemoveContainer" containerID="dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.867590 5047 scope.go:117] "RemoveContainer" containerID="3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.914096 5047 scope.go:117] "RemoveContainer" containerID="e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db" Feb 23 08:20:07 crc kubenswrapper[5047]: E0223 08:20:07.914944 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db\": container with ID starting with e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db not found: ID does not exist" containerID="e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.915024 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db"} err="failed to get container status \"e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db\": rpc error: code = NotFound desc = could not find container \"e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db\": container with ID starting with e45bb9ccb1613e8ab0eaf6ea2292e545b782c9138534c90b28c5324614a451db not found: ID does not exist" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.915071 5047 scope.go:117] "RemoveContainer" containerID="dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247" Feb 23 08:20:07 crc kubenswrapper[5047]: E0223 08:20:07.915817 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247\": container with ID starting with dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247 not found: ID does not exist" containerID="dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.915952 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247"} err="failed to get container status \"dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247\": rpc error: code = NotFound desc = could not find container \"dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247\": container with ID starting with dc4a3a54f0f682ea775182ee51743524901d4ed04ec7971618a28089ea482247 not found: ID does not exist" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.915998 5047 scope.go:117] "RemoveContainer" containerID="3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb" Feb 23 08:20:07 crc kubenswrapper[5047]: E0223 
08:20:07.916560 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb\": container with ID starting with 3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb not found: ID does not exist" containerID="3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb" Feb 23 08:20:07 crc kubenswrapper[5047]: I0223 08:20:07.916621 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb"} err="failed to get container status \"3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb\": rpc error: code = NotFound desc = could not find container \"3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb\": container with ID starting with 3b8297d430b91fcded0363838996194fd22ebb664800466ae6fc256931aec5cb not found: ID does not exist" Feb 23 08:20:08 crc kubenswrapper[5047]: I0223 08:20:08.354039 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" path="/var/lib/kubelet/pods/2b948d47-0b1c-404a-b39d-db5fddc4d1f9/volumes" Feb 23 08:21:16 crc kubenswrapper[5047]: I0223 08:21:16.759676 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:21:16 crc kubenswrapper[5047]: I0223 08:21:16.760318 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.068453 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cmx6w"] Feb 23 08:21:25 crc kubenswrapper[5047]: E0223 08:21:25.069554 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="registry-server" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.069576 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="registry-server" Feb 23 08:21:25 crc kubenswrapper[5047]: E0223 08:21:25.069594 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="extract-utilities" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.069606 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="extract-utilities" Feb 23 08:21:25 crc kubenswrapper[5047]: E0223 08:21:25.069627 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="extract-content" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.069643 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="extract-content" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.069939 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b948d47-0b1c-404a-b39d-db5fddc4d1f9" containerName="registry-server" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.072413 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.087722 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmx6w"] Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.176813 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdjk\" (UniqueName: \"kubernetes.io/projected/68d773ac-daa4-4469-a4cb-1503a1c40509-kube-api-access-6jdjk\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.176867 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-utilities\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.176917 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-catalog-content\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.279983 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdjk\" (UniqueName: \"kubernetes.io/projected/68d773ac-daa4-4469-a4cb-1503a1c40509-kube-api-access-6jdjk\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.280044 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-utilities\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.280066 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-catalog-content\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.280609 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-catalog-content\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.281053 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-utilities\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.303523 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdjk\" (UniqueName: \"kubernetes.io/projected/68d773ac-daa4-4469-a4cb-1503a1c40509-kube-api-access-6jdjk\") pod \"certified-operators-cmx6w\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.431766 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:25 crc kubenswrapper[5047]: I0223 08:21:25.933024 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmx6w"] Feb 23 08:21:26 crc kubenswrapper[5047]: I0223 08:21:26.515881 5047 generic.go:334] "Generic (PLEG): container finished" podID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerID="a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da" exitCode=0 Feb 23 08:21:26 crc kubenswrapper[5047]: I0223 08:21:26.516027 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmx6w" event={"ID":"68d773ac-daa4-4469-a4cb-1503a1c40509","Type":"ContainerDied","Data":"a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da"} Feb 23 08:21:26 crc kubenswrapper[5047]: I0223 08:21:26.516450 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmx6w" event={"ID":"68d773ac-daa4-4469-a4cb-1503a1c40509","Type":"ContainerStarted","Data":"2224773bba5ebd17f34752196b24d506dfa3be2271cf2d621152c6490d0a57b0"} Feb 23 08:21:27 crc kubenswrapper[5047]: I0223 08:21:27.526007 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmx6w" event={"ID":"68d773ac-daa4-4469-a4cb-1503a1c40509","Type":"ContainerStarted","Data":"e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879"} Feb 23 08:21:28 crc kubenswrapper[5047]: I0223 08:21:28.540339 5047 generic.go:334] "Generic (PLEG): container finished" podID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerID="e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879" exitCode=0 Feb 23 08:21:28 crc kubenswrapper[5047]: I0223 08:21:28.540417 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmx6w" 
event={"ID":"68d773ac-daa4-4469-a4cb-1503a1c40509","Type":"ContainerDied","Data":"e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879"} Feb 23 08:21:29 crc kubenswrapper[5047]: I0223 08:21:29.552155 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmx6w" event={"ID":"68d773ac-daa4-4469-a4cb-1503a1c40509","Type":"ContainerStarted","Data":"e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe"} Feb 23 08:21:29 crc kubenswrapper[5047]: I0223 08:21:29.584046 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cmx6w" podStartSLOduration=2.122137553 podStartE2EDuration="4.584019431s" podCreationTimestamp="2026-02-23 08:21:25 +0000 UTC" firstStartedPulling="2026-02-23 08:21:26.518717975 +0000 UTC m=+5808.770045139" lastFinishedPulling="2026-02-23 08:21:28.980599873 +0000 UTC m=+5811.231927017" observedRunningTime="2026-02-23 08:21:29.575207603 +0000 UTC m=+5811.826534767" watchObservedRunningTime="2026-02-23 08:21:29.584019431 +0000 UTC m=+5811.835346575" Feb 23 08:21:35 crc kubenswrapper[5047]: I0223 08:21:35.433278 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:35 crc kubenswrapper[5047]: I0223 08:21:35.433990 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:35 crc kubenswrapper[5047]: I0223 08:21:35.512967 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:35 crc kubenswrapper[5047]: I0223 08:21:35.699116 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:35 crc kubenswrapper[5047]: I0223 08:21:35.770196 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-cmx6w"] Feb 23 08:21:37 crc kubenswrapper[5047]: I0223 08:21:37.647876 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cmx6w" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="registry-server" containerID="cri-o://e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe" gracePeriod=2 Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.230129 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.317763 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-utilities\") pod \"68d773ac-daa4-4469-a4cb-1503a1c40509\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.317950 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-catalog-content\") pod \"68d773ac-daa4-4469-a4cb-1503a1c40509\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.318044 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdjk\" (UniqueName: \"kubernetes.io/projected/68d773ac-daa4-4469-a4cb-1503a1c40509-kube-api-access-6jdjk\") pod \"68d773ac-daa4-4469-a4cb-1503a1c40509\" (UID: \"68d773ac-daa4-4469-a4cb-1503a1c40509\") " Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.320108 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-utilities" (OuterVolumeSpecName: "utilities") pod "68d773ac-daa4-4469-a4cb-1503a1c40509" (UID: 
"68d773ac-daa4-4469-a4cb-1503a1c40509"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.334288 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d773ac-daa4-4469-a4cb-1503a1c40509-kube-api-access-6jdjk" (OuterVolumeSpecName: "kube-api-access-6jdjk") pod "68d773ac-daa4-4469-a4cb-1503a1c40509" (UID: "68d773ac-daa4-4469-a4cb-1503a1c40509"). InnerVolumeSpecName "kube-api-access-6jdjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.419959 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.419995 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdjk\" (UniqueName: \"kubernetes.io/projected/68d773ac-daa4-4469-a4cb-1503a1c40509-kube-api-access-6jdjk\") on node \"crc\" DevicePath \"\"" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.499327 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68d773ac-daa4-4469-a4cb-1503a1c40509" (UID: "68d773ac-daa4-4469-a4cb-1503a1c40509"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.521343 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d773ac-daa4-4469-a4cb-1503a1c40509-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.662556 5047 generic.go:334] "Generic (PLEG): container finished" podID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerID="e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe" exitCode=0 Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.662626 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmx6w" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.662652 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmx6w" event={"ID":"68d773ac-daa4-4469-a4cb-1503a1c40509","Type":"ContainerDied","Data":"e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe"} Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.664284 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmx6w" event={"ID":"68d773ac-daa4-4469-a4cb-1503a1c40509","Type":"ContainerDied","Data":"2224773bba5ebd17f34752196b24d506dfa3be2271cf2d621152c6490d0a57b0"} Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.664314 5047 scope.go:117] "RemoveContainer" containerID="e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.712431 5047 scope.go:117] "RemoveContainer" containerID="e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.718289 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmx6w"] Feb 23 08:21:38 crc kubenswrapper[5047]: 
I0223 08:21:38.727311 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cmx6w"] Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.734612 5047 scope.go:117] "RemoveContainer" containerID="a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.779197 5047 scope.go:117] "RemoveContainer" containerID="e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe" Feb 23 08:21:38 crc kubenswrapper[5047]: E0223 08:21:38.779792 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe\": container with ID starting with e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe not found: ID does not exist" containerID="e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.779854 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe"} err="failed to get container status \"e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe\": rpc error: code = NotFound desc = could not find container \"e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe\": container with ID starting with e6116932ad5a3dfb1774668435381d9fcde499441b9e8866dd51335fbf2fd6fe not found: ID does not exist" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.779893 5047 scope.go:117] "RemoveContainer" containerID="e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879" Feb 23 08:21:38 crc kubenswrapper[5047]: E0223 08:21:38.780768 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879\": container 
with ID starting with e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879 not found: ID does not exist" containerID="e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.780801 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879"} err="failed to get container status \"e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879\": rpc error: code = NotFound desc = could not find container \"e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879\": container with ID starting with e1b6e862bd0d12512df7154770782a357ecc5140e8015f8d512f7be8144b0879 not found: ID does not exist" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.780818 5047 scope.go:117] "RemoveContainer" containerID="a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da" Feb 23 08:21:38 crc kubenswrapper[5047]: E0223 08:21:38.781369 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da\": container with ID starting with a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da not found: ID does not exist" containerID="a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da" Feb 23 08:21:38 crc kubenswrapper[5047]: I0223 08:21:38.781444 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da"} err="failed to get container status \"a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da\": rpc error: code = NotFound desc = could not find container \"a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da\": container with ID starting with a2e825680b85399056fbe3d33d5dc3eec291342d5074e93c38f7a1e8d76264da not 
found: ID does not exist" Feb 23 08:21:40 crc kubenswrapper[5047]: I0223 08:21:40.360483 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" path="/var/lib/kubelet/pods/68d773ac-daa4-4469-a4cb-1503a1c40509/volumes" Feb 23 08:21:46 crc kubenswrapper[5047]: I0223 08:21:46.759864 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:21:46 crc kubenswrapper[5047]: I0223 08:21:46.760358 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:22:16 crc kubenswrapper[5047]: I0223 08:22:16.760056 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:22:16 crc kubenswrapper[5047]: I0223 08:22:16.760996 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:22:16 crc kubenswrapper[5047]: I0223 08:22:16.761091 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 08:22:16 crc 
kubenswrapper[5047]: I0223 08:22:16.762520 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7adba1578f3b89f77a4d3dbd29dbc310a04b48452a2a36e2244bdd32ebf8310f"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:22:16 crc kubenswrapper[5047]: I0223 08:22:16.762653 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://7adba1578f3b89f77a4d3dbd29dbc310a04b48452a2a36e2244bdd32ebf8310f" gracePeriod=600 Feb 23 08:22:17 crc kubenswrapper[5047]: I0223 08:22:17.053387 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="7adba1578f3b89f77a4d3dbd29dbc310a04b48452a2a36e2244bdd32ebf8310f" exitCode=0 Feb 23 08:22:17 crc kubenswrapper[5047]: I0223 08:22:17.053447 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"7adba1578f3b89f77a4d3dbd29dbc310a04b48452a2a36e2244bdd32ebf8310f"} Feb 23 08:22:17 crc kubenswrapper[5047]: I0223 08:22:17.053534 5047 scope.go:117] "RemoveContainer" containerID="c4b3f3b587c8023bb3a091c5914e6da2e4aae3714d162fd8368781ec92f1dd59" Feb 23 08:22:18 crc kubenswrapper[5047]: I0223 08:22:18.067537 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0"} Feb 23 08:24:46 crc kubenswrapper[5047]: I0223 08:24:46.759955 5047 
patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:24:46 crc kubenswrapper[5047]: I0223 08:24:46.760834 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.364007 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5p47f"] Feb 23 08:24:48 crc kubenswrapper[5047]: E0223 08:24:48.364513 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="extract-content" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.364533 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="extract-content" Feb 23 08:24:48 crc kubenswrapper[5047]: E0223 08:24:48.364552 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="extract-utilities" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.364561 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="extract-utilities" Feb 23 08:24:48 crc kubenswrapper[5047]: E0223 08:24:48.364582 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="registry-server" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.364605 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="registry-server" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.364834 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d773ac-daa4-4469-a4cb-1503a1c40509" containerName="registry-server" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.366264 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.390079 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5p47f"] Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.471736 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-utilities\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.471808 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-catalog-content\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.472012 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpf46\" (UniqueName: \"kubernetes.io/projected/6fd259d4-6700-4168-b2ea-47a93d0204af-kube-api-access-mpf46\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.573197 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-utilities\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.573262 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-catalog-content\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.573335 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpf46\" (UniqueName: \"kubernetes.io/projected/6fd259d4-6700-4168-b2ea-47a93d0204af-kube-api-access-mpf46\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.574183 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-utilities\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.574270 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-catalog-content\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.594526 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mpf46\" (UniqueName: \"kubernetes.io/projected/6fd259d4-6700-4168-b2ea-47a93d0204af-kube-api-access-mpf46\") pod \"community-operators-5p47f\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:48 crc kubenswrapper[5047]: I0223 08:24:48.698290 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:49 crc kubenswrapper[5047]: I0223 08:24:49.234994 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5p47f"] Feb 23 08:24:49 crc kubenswrapper[5047]: I0223 08:24:49.641036 5047 generic.go:334] "Generic (PLEG): container finished" podID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerID="528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768" exitCode=0 Feb 23 08:24:49 crc kubenswrapper[5047]: I0223 08:24:49.641117 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p47f" event={"ID":"6fd259d4-6700-4168-b2ea-47a93d0204af","Type":"ContainerDied","Data":"528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768"} Feb 23 08:24:49 crc kubenswrapper[5047]: I0223 08:24:49.641589 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p47f" event={"ID":"6fd259d4-6700-4168-b2ea-47a93d0204af","Type":"ContainerStarted","Data":"a94829b695903eb7603fb2fb80ab505ece212f933e096241b9f6bd63bea94a4e"} Feb 23 08:24:49 crc kubenswrapper[5047]: I0223 08:24:49.644421 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:24:50 crc kubenswrapper[5047]: I0223 08:24:50.654301 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p47f" 
event={"ID":"6fd259d4-6700-4168-b2ea-47a93d0204af","Type":"ContainerStarted","Data":"770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa"} Feb 23 08:24:51 crc kubenswrapper[5047]: I0223 08:24:51.667656 5047 generic.go:334] "Generic (PLEG): container finished" podID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerID="770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa" exitCode=0 Feb 23 08:24:51 crc kubenswrapper[5047]: I0223 08:24:51.667725 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p47f" event={"ID":"6fd259d4-6700-4168-b2ea-47a93d0204af","Type":"ContainerDied","Data":"770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa"} Feb 23 08:24:52 crc kubenswrapper[5047]: I0223 08:24:52.683321 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p47f" event={"ID":"6fd259d4-6700-4168-b2ea-47a93d0204af","Type":"ContainerStarted","Data":"6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4"} Feb 23 08:24:52 crc kubenswrapper[5047]: I0223 08:24:52.705559 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5p47f" podStartSLOduration=2.288703989 podStartE2EDuration="4.705531903s" podCreationTimestamp="2026-02-23 08:24:48 +0000 UTC" firstStartedPulling="2026-02-23 08:24:49.644012311 +0000 UTC m=+6011.895339445" lastFinishedPulling="2026-02-23 08:24:52.060840195 +0000 UTC m=+6014.312167359" observedRunningTime="2026-02-23 08:24:52.704394872 +0000 UTC m=+6014.955722016" watchObservedRunningTime="2026-02-23 08:24:52.705531903 +0000 UTC m=+6014.956859037" Feb 23 08:24:54 crc kubenswrapper[5047]: I0223 08:24:54.840655 5047 patch_prober.go:28] interesting pod/console-75bc6c8444-266zl container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/health\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 08:24:54 crc kubenswrapper[5047]: I0223 08:24:54.841159 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-75bc6c8444-266zl" podUID="06d8380a-7cbc-44cf-986a-4597f832dfb9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.46:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 08:24:58 crc kubenswrapper[5047]: I0223 08:24:58.699267 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:58 crc kubenswrapper[5047]: I0223 08:24:58.701079 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:58 crc kubenswrapper[5047]: I0223 08:24:58.775507 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:59 crc kubenswrapper[5047]: I0223 08:24:59.828175 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:24:59 crc kubenswrapper[5047]: I0223 08:24:59.888538 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5p47f"] Feb 23 08:25:01 crc kubenswrapper[5047]: I0223 08:25:01.768845 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5p47f" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="registry-server" containerID="cri-o://6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4" gracePeriod=2 Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.241548 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.347630 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpf46\" (UniqueName: \"kubernetes.io/projected/6fd259d4-6700-4168-b2ea-47a93d0204af-kube-api-access-mpf46\") pod \"6fd259d4-6700-4168-b2ea-47a93d0204af\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.347735 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-catalog-content\") pod \"6fd259d4-6700-4168-b2ea-47a93d0204af\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.347795 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-utilities\") pod \"6fd259d4-6700-4168-b2ea-47a93d0204af\" (UID: \"6fd259d4-6700-4168-b2ea-47a93d0204af\") " Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.349936 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-utilities" (OuterVolumeSpecName: "utilities") pod "6fd259d4-6700-4168-b2ea-47a93d0204af" (UID: "6fd259d4-6700-4168-b2ea-47a93d0204af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.357269 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd259d4-6700-4168-b2ea-47a93d0204af-kube-api-access-mpf46" (OuterVolumeSpecName: "kube-api-access-mpf46") pod "6fd259d4-6700-4168-b2ea-47a93d0204af" (UID: "6fd259d4-6700-4168-b2ea-47a93d0204af"). InnerVolumeSpecName "kube-api-access-mpf46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.413264 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fd259d4-6700-4168-b2ea-47a93d0204af" (UID: "6fd259d4-6700-4168-b2ea-47a93d0204af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.450117 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpf46\" (UniqueName: \"kubernetes.io/projected/6fd259d4-6700-4168-b2ea-47a93d0204af-kube-api-access-mpf46\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.450166 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.450177 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd259d4-6700-4168-b2ea-47a93d0204af-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.795174 5047 generic.go:334] "Generic (PLEG): container finished" podID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerID="6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4" exitCode=0 Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.795318 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p47f" event={"ID":"6fd259d4-6700-4168-b2ea-47a93d0204af","Type":"ContainerDied","Data":"6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4"} Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.795387 5047 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-5p47f" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.795478 5047 scope.go:117] "RemoveContainer" containerID="6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.795427 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p47f" event={"ID":"6fd259d4-6700-4168-b2ea-47a93d0204af","Type":"ContainerDied","Data":"a94829b695903eb7603fb2fb80ab505ece212f933e096241b9f6bd63bea94a4e"} Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.817927 5047 scope.go:117] "RemoveContainer" containerID="770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.849190 5047 scope.go:117] "RemoveContainer" containerID="528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.864765 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5p47f"] Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.875542 5047 scope.go:117] "RemoveContainer" containerID="6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4" Feb 23 08:25:02 crc kubenswrapper[5047]: E0223 08:25:02.876038 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4\": container with ID starting with 6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4 not found: ID does not exist" containerID="6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.876098 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4"} 
err="failed to get container status \"6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4\": rpc error: code = NotFound desc = could not find container \"6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4\": container with ID starting with 6ec2c9992e30ff7805cd49ee223cf9cb1bbb45aa609120f5089bd4c4e5a865a4 not found: ID does not exist" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.876141 5047 scope.go:117] "RemoveContainer" containerID="770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa" Feb 23 08:25:02 crc kubenswrapper[5047]: E0223 08:25:02.876767 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa\": container with ID starting with 770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa not found: ID does not exist" containerID="770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.876910 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa"} err="failed to get container status \"770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa\": rpc error: code = NotFound desc = could not find container \"770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa\": container with ID starting with 770d3938f109cfa85a306839794e00a7d5c700adec30fff3e69f73e0f7e9f6aa not found: ID does not exist" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.877007 5047 scope.go:117] "RemoveContainer" containerID="528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768" Feb 23 08:25:02 crc kubenswrapper[5047]: E0223 08:25:02.877430 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768\": container with ID starting with 528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768 not found: ID does not exist" containerID="528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.877458 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768"} err="failed to get container status \"528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768\": rpc error: code = NotFound desc = could not find container \"528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768\": container with ID starting with 528206fbfcea3a8f496433adfd93815bc1dbcf55f0245ea235a15f4ff62c4768 not found: ID does not exist" Feb 23 08:25:02 crc kubenswrapper[5047]: I0223 08:25:02.881141 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5p47f"] Feb 23 08:25:04 crc kubenswrapper[5047]: I0223 08:25:04.349062 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" path="/var/lib/kubelet/pods/6fd259d4-6700-4168-b2ea-47a93d0204af/volumes" Feb 23 08:25:16 crc kubenswrapper[5047]: I0223 08:25:16.759767 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:25:16 crc kubenswrapper[5047]: I0223 08:25:16.760336 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 23 08:25:46 crc kubenswrapper[5047]: I0223 08:25:46.759482 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:25:46 crc kubenswrapper[5047]: I0223 08:25:46.760504 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:25:46 crc kubenswrapper[5047]: I0223 08:25:46.760588 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 08:25:46 crc kubenswrapper[5047]: I0223 08:25:46.761673 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:25:46 crc kubenswrapper[5047]: I0223 08:25:46.761791 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" gracePeriod=600 Feb 23 08:25:46 crc kubenswrapper[5047]: E0223 08:25:46.902411 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:25:47 crc kubenswrapper[5047]: I0223 08:25:47.230545 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" exitCode=0 Feb 23 08:25:47 crc kubenswrapper[5047]: I0223 08:25:47.230619 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0"} Feb 23 08:25:47 crc kubenswrapper[5047]: I0223 08:25:47.230686 5047 scope.go:117] "RemoveContainer" containerID="7adba1578f3b89f77a4d3dbd29dbc310a04b48452a2a36e2244bdd32ebf8310f" Feb 23 08:25:47 crc kubenswrapper[5047]: I0223 08:25:47.231570 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:25:47 crc kubenswrapper[5047]: E0223 08:25:47.232159 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:26:01 crc kubenswrapper[5047]: I0223 08:26:01.342377 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:26:01 crc kubenswrapper[5047]: 
E0223 08:26:01.343620 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:26:16 crc kubenswrapper[5047]: I0223 08:26:16.341299 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:26:16 crc kubenswrapper[5047]: E0223 08:26:16.342276 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:26:27 crc kubenswrapper[5047]: I0223 08:26:27.340845 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:26:27 crc kubenswrapper[5047]: E0223 08:26:27.341679 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.074588 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-rgc85"] Feb 23 08:26:35 crc kubenswrapper[5047]: 
I0223 08:26:35.081267 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-rgc85"] Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.187978 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7p2lg"] Feb 23 08:26:35 crc kubenswrapper[5047]: E0223 08:26:35.188369 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="registry-server" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.188386 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="registry-server" Feb 23 08:26:35 crc kubenswrapper[5047]: E0223 08:26:35.188406 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="extract-utilities" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.188414 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="extract-utilities" Feb 23 08:26:35 crc kubenswrapper[5047]: E0223 08:26:35.188426 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="extract-content" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.188433 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="extract-content" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.188598 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd259d4-6700-4168-b2ea-47a93d0204af" containerName="registry-server" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.189180 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.192399 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.192417 5047 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-z68qx" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.192573 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.192628 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.208656 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7p2lg"] Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.330807 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75mvz\" (UniqueName: \"kubernetes.io/projected/0ceec823-4283-4019-96b7-f390cb3c23ff-kube-api-access-75mvz\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.330865 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ceec823-4283-4019-96b7-f390cb3c23ff-crc-storage\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.330931 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ceec823-4283-4019-96b7-f390cb3c23ff-node-mnt\") pod \"crc-storage-crc-7p2lg\" (UID: 
\"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.432257 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75mvz\" (UniqueName: \"kubernetes.io/projected/0ceec823-4283-4019-96b7-f390cb3c23ff-kube-api-access-75mvz\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.432365 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ceec823-4283-4019-96b7-f390cb3c23ff-crc-storage\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.433193 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ceec823-4283-4019-96b7-f390cb3c23ff-crc-storage\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.433277 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ceec823-4283-4019-96b7-f390cb3c23ff-node-mnt\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.433500 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ceec823-4283-4019-96b7-f390cb3c23ff-node-mnt\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.456359 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75mvz\" (UniqueName: \"kubernetes.io/projected/0ceec823-4283-4019-96b7-f390cb3c23ff-kube-api-access-75mvz\") pod \"crc-storage-crc-7p2lg\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.516545 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:35 crc kubenswrapper[5047]: I0223 08:26:35.881866 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7p2lg"] Feb 23 08:26:36 crc kubenswrapper[5047]: I0223 08:26:36.349555 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b5e7015-387b-4bed-893c-0c80220d98e5" path="/var/lib/kubelet/pods/8b5e7015-387b-4bed-893c-0c80220d98e5/volumes" Feb 23 08:26:36 crc kubenswrapper[5047]: I0223 08:26:36.730744 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7p2lg" event={"ID":"0ceec823-4283-4019-96b7-f390cb3c23ff","Type":"ContainerStarted","Data":"e0028e7efb3f0193fba1eeb4c3255cc8e255cae96ce87cdb24d76ae0b574b86b"} Feb 23 08:26:36 crc kubenswrapper[5047]: I0223 08:26:36.731330 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7p2lg" event={"ID":"0ceec823-4283-4019-96b7-f390cb3c23ff","Type":"ContainerStarted","Data":"823f31aab2c3dca17c2285d52c4decb2e6309c5a59c3677b9795a19326df92df"} Feb 23 08:26:36 crc kubenswrapper[5047]: I0223 08:26:36.768128 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-7p2lg" podStartSLOduration=1.281089219 podStartE2EDuration="1.768095914s" podCreationTimestamp="2026-02-23 08:26:35 +0000 UTC" firstStartedPulling="2026-02-23 08:26:35.889749618 +0000 UTC m=+6118.141076772" lastFinishedPulling="2026-02-23 08:26:36.376756333 +0000 UTC m=+6118.628083467" 
observedRunningTime="2026-02-23 08:26:36.758088714 +0000 UTC m=+6119.009415888" watchObservedRunningTime="2026-02-23 08:26:36.768095914 +0000 UTC m=+6119.019423058" Feb 23 08:26:37 crc kubenswrapper[5047]: I0223 08:26:37.739816 5047 generic.go:334] "Generic (PLEG): container finished" podID="0ceec823-4283-4019-96b7-f390cb3c23ff" containerID="e0028e7efb3f0193fba1eeb4c3255cc8e255cae96ce87cdb24d76ae0b574b86b" exitCode=0 Feb 23 08:26:37 crc kubenswrapper[5047]: I0223 08:26:37.739886 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7p2lg" event={"ID":"0ceec823-4283-4019-96b7-f390cb3c23ff","Type":"ContainerDied","Data":"e0028e7efb3f0193fba1eeb4c3255cc8e255cae96ce87cdb24d76ae0b574b86b"} Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.110682 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.293882 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75mvz\" (UniqueName: \"kubernetes.io/projected/0ceec823-4283-4019-96b7-f390cb3c23ff-kube-api-access-75mvz\") pod \"0ceec823-4283-4019-96b7-f390cb3c23ff\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.294052 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ceec823-4283-4019-96b7-f390cb3c23ff-crc-storage\") pod \"0ceec823-4283-4019-96b7-f390cb3c23ff\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.294127 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ceec823-4283-4019-96b7-f390cb3c23ff-node-mnt\") pod \"0ceec823-4283-4019-96b7-f390cb3c23ff\" (UID: \"0ceec823-4283-4019-96b7-f390cb3c23ff\") " Feb 23 08:26:39 crc 
kubenswrapper[5047]: I0223 08:26:39.294536 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ceec823-4283-4019-96b7-f390cb3c23ff-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0ceec823-4283-4019-96b7-f390cb3c23ff" (UID: "0ceec823-4283-4019-96b7-f390cb3c23ff"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.300560 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ceec823-4283-4019-96b7-f390cb3c23ff-kube-api-access-75mvz" (OuterVolumeSpecName: "kube-api-access-75mvz") pod "0ceec823-4283-4019-96b7-f390cb3c23ff" (UID: "0ceec823-4283-4019-96b7-f390cb3c23ff"). InnerVolumeSpecName "kube-api-access-75mvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.315534 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ceec823-4283-4019-96b7-f390cb3c23ff-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0ceec823-4283-4019-96b7-f390cb3c23ff" (UID: "0ceec823-4283-4019-96b7-f390cb3c23ff"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.396532 5047 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ceec823-4283-4019-96b7-f390cb3c23ff-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.396588 5047 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ceec823-4283-4019-96b7-f390cb3c23ff-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.396614 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75mvz\" (UniqueName: \"kubernetes.io/projected/0ceec823-4283-4019-96b7-f390cb3c23ff-kube-api-access-75mvz\") on node \"crc\" DevicePath \"\"" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.757517 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7p2lg" event={"ID":"0ceec823-4283-4019-96b7-f390cb3c23ff","Type":"ContainerDied","Data":"823f31aab2c3dca17c2285d52c4decb2e6309c5a59c3677b9795a19326df92df"} Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.757572 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823f31aab2c3dca17c2285d52c4decb2e6309c5a59c3677b9795a19326df92df" Feb 23 08:26:39 crc kubenswrapper[5047]: I0223 08:26:39.757622 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7p2lg" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.206408 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-7p2lg"] Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.216210 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-7p2lg"] Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.319856 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-vx7pv"] Feb 23 08:26:41 crc kubenswrapper[5047]: E0223 08:26:41.320372 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ceec823-4283-4019-96b7-f390cb3c23ff" containerName="storage" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.320401 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ceec823-4283-4019-96b7-f390cb3c23ff" containerName="storage" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.320747 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ceec823-4283-4019-96b7-f390cb3c23ff" containerName="storage" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.321538 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.324684 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.325167 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.325402 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.326475 5047 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-z68qx" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.332165 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vx7pv"] Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.349672 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm8hx\" (UniqueName: \"kubernetes.io/projected/736bd9f7-11ff-4ed7-9272-cde88f3fadba-kube-api-access-jm8hx\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.349740 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/736bd9f7-11ff-4ed7-9272-cde88f3fadba-crc-storage\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.349770 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/736bd9f7-11ff-4ed7-9272-cde88f3fadba-node-mnt\") pod \"crc-storage-crc-vx7pv\" (UID: 
\"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.451326 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/736bd9f7-11ff-4ed7-9272-cde88f3fadba-crc-storage\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.451391 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/736bd9f7-11ff-4ed7-9272-cde88f3fadba-node-mnt\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.451599 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm8hx\" (UniqueName: \"kubernetes.io/projected/736bd9f7-11ff-4ed7-9272-cde88f3fadba-kube-api-access-jm8hx\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.452325 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/736bd9f7-11ff-4ed7-9272-cde88f3fadba-node-mnt\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.452393 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/736bd9f7-11ff-4ed7-9272-cde88f3fadba-crc-storage\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.485493 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm8hx\" (UniqueName: \"kubernetes.io/projected/736bd9f7-11ff-4ed7-9272-cde88f3fadba-kube-api-access-jm8hx\") pod \"crc-storage-crc-vx7pv\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.657613 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:41 crc kubenswrapper[5047]: I0223 08:26:41.933558 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vx7pv"] Feb 23 08:26:42 crc kubenswrapper[5047]: I0223 08:26:42.341232 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:26:42 crc kubenswrapper[5047]: E0223 08:26:42.342119 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:26:42 crc kubenswrapper[5047]: I0223 08:26:42.363071 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ceec823-4283-4019-96b7-f390cb3c23ff" path="/var/lib/kubelet/pods/0ceec823-4283-4019-96b7-f390cb3c23ff/volumes" Feb 23 08:26:42 crc kubenswrapper[5047]: I0223 08:26:42.789299 5047 generic.go:334] "Generic (PLEG): container finished" podID="736bd9f7-11ff-4ed7-9272-cde88f3fadba" containerID="0d58a6d9591526ffb706e3edab36d8a69890237ef93cb7e7814ab06861b96f16" exitCode=0 Feb 23 08:26:42 crc kubenswrapper[5047]: I0223 08:26:42.789420 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vx7pv" 
event={"ID":"736bd9f7-11ff-4ed7-9272-cde88f3fadba","Type":"ContainerDied","Data":"0d58a6d9591526ffb706e3edab36d8a69890237ef93cb7e7814ab06861b96f16"} Feb 23 08:26:42 crc kubenswrapper[5047]: I0223 08:26:42.789861 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vx7pv" event={"ID":"736bd9f7-11ff-4ed7-9272-cde88f3fadba","Type":"ContainerStarted","Data":"1f5223dad6c67c72b6fc82cb8f4f5bee99871abca31015a9ccbabf21daf18ee0"} Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.140387 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.212995 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/736bd9f7-11ff-4ed7-9272-cde88f3fadba-node-mnt\") pod \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.213213 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736bd9f7-11ff-4ed7-9272-cde88f3fadba-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "736bd9f7-11ff-4ed7-9272-cde88f3fadba" (UID: "736bd9f7-11ff-4ed7-9272-cde88f3fadba"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.213380 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm8hx\" (UniqueName: \"kubernetes.io/projected/736bd9f7-11ff-4ed7-9272-cde88f3fadba-kube-api-access-jm8hx\") pod \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.213461 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/736bd9f7-11ff-4ed7-9272-cde88f3fadba-crc-storage\") pod \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\" (UID: \"736bd9f7-11ff-4ed7-9272-cde88f3fadba\") " Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.214500 5047 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/736bd9f7-11ff-4ed7-9272-cde88f3fadba-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.219831 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736bd9f7-11ff-4ed7-9272-cde88f3fadba-kube-api-access-jm8hx" (OuterVolumeSpecName: "kube-api-access-jm8hx") pod "736bd9f7-11ff-4ed7-9272-cde88f3fadba" (UID: "736bd9f7-11ff-4ed7-9272-cde88f3fadba"). InnerVolumeSpecName "kube-api-access-jm8hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.232253 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736bd9f7-11ff-4ed7-9272-cde88f3fadba-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "736bd9f7-11ff-4ed7-9272-cde88f3fadba" (UID: "736bd9f7-11ff-4ed7-9272-cde88f3fadba"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.316484 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm8hx\" (UniqueName: \"kubernetes.io/projected/736bd9f7-11ff-4ed7-9272-cde88f3fadba-kube-api-access-jm8hx\") on node \"crc\" DevicePath \"\"" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.316531 5047 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/736bd9f7-11ff-4ed7-9272-cde88f3fadba-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.814644 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vx7pv" event={"ID":"736bd9f7-11ff-4ed7-9272-cde88f3fadba","Type":"ContainerDied","Data":"1f5223dad6c67c72b6fc82cb8f4f5bee99871abca31015a9ccbabf21daf18ee0"} Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.814697 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f5223dad6c67c72b6fc82cb8f4f5bee99871abca31015a9ccbabf21daf18ee0" Feb 23 08:26:44 crc kubenswrapper[5047]: I0223 08:26:44.814698 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vx7pv" Feb 23 08:26:55 crc kubenswrapper[5047]: I0223 08:26:55.341628 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:26:55 crc kubenswrapper[5047]: E0223 08:26:55.342966 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:26:55 crc kubenswrapper[5047]: I0223 08:26:55.433341 5047 scope.go:117] "RemoveContainer" containerID="bfb3b1b6c1bb52e7fa3129c3b4e49a5d31e93d027bc077159a879900e4e36f7e" Feb 23 08:27:10 crc kubenswrapper[5047]: I0223 08:27:10.341606 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:27:10 crc kubenswrapper[5047]: E0223 08:27:10.342740 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:27:23 crc kubenswrapper[5047]: I0223 08:27:23.341711 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:27:23 crc kubenswrapper[5047]: E0223 08:27:23.343328 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:27:35 crc kubenswrapper[5047]: I0223 08:27:35.340744 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:27:35 crc kubenswrapper[5047]: E0223 08:27:35.342315 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:27:47 crc kubenswrapper[5047]: I0223 08:27:47.340851 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:27:47 crc kubenswrapper[5047]: E0223 08:27:47.341820 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.663944 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zfvrc"] Feb 23 08:27:51 crc kubenswrapper[5047]: E0223 08:27:51.664693 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736bd9f7-11ff-4ed7-9272-cde88f3fadba" containerName="storage" Feb 23 08:27:51 crc 
kubenswrapper[5047]: I0223 08:27:51.664710 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="736bd9f7-11ff-4ed7-9272-cde88f3fadba" containerName="storage" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.664880 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="736bd9f7-11ff-4ed7-9272-cde88f3fadba" containerName="storage" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.666279 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.671873 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfvrc"] Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.726559 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-catalog-content\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.726625 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-utilities\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.726694 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gr48\" (UniqueName: \"kubernetes.io/projected/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-kube-api-access-8gr48\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc 
kubenswrapper[5047]: I0223 08:27:51.828311 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-catalog-content\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.828373 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-utilities\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.828459 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gr48\" (UniqueName: \"kubernetes.io/projected/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-kube-api-access-8gr48\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.828957 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-catalog-content\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.829031 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-utilities\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.851183 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gr48\" (UniqueName: \"kubernetes.io/projected/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-kube-api-access-8gr48\") pod \"redhat-marketplace-zfvrc\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:51 crc kubenswrapper[5047]: I0223 08:27:51.996301 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:27:52 crc kubenswrapper[5047]: I0223 08:27:52.470804 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfvrc"] Feb 23 08:27:53 crc kubenswrapper[5047]: I0223 08:27:53.469588 5047 generic.go:334] "Generic (PLEG): container finished" podID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerID="1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e" exitCode=0 Feb 23 08:27:53 crc kubenswrapper[5047]: I0223 08:27:53.469677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfvrc" event={"ID":"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211","Type":"ContainerDied","Data":"1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e"} Feb 23 08:27:53 crc kubenswrapper[5047]: I0223 08:27:53.470183 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfvrc" event={"ID":"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211","Type":"ContainerStarted","Data":"65d65533a7540cb3b524bf6dcc0fbece7b4d6fa7ec3469c4a9554578d398d7c2"} Feb 23 08:27:55 crc kubenswrapper[5047]: I0223 08:27:55.599140 5047 generic.go:334] "Generic (PLEG): container finished" podID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerID="5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39" exitCode=0 Feb 23 08:27:55 crc kubenswrapper[5047]: I0223 08:27:55.599213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-zfvrc" event={"ID":"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211","Type":"ContainerDied","Data":"5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39"} Feb 23 08:27:56 crc kubenswrapper[5047]: I0223 08:27:56.612443 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfvrc" event={"ID":"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211","Type":"ContainerStarted","Data":"b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1"} Feb 23 08:27:56 crc kubenswrapper[5047]: I0223 08:27:56.648884 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zfvrc" podStartSLOduration=3.103722918 podStartE2EDuration="5.648845435s" podCreationTimestamp="2026-02-23 08:27:51 +0000 UTC" firstStartedPulling="2026-02-23 08:27:53.471971278 +0000 UTC m=+6195.723298452" lastFinishedPulling="2026-02-23 08:27:56.017093825 +0000 UTC m=+6198.268420969" observedRunningTime="2026-02-23 08:27:56.639130114 +0000 UTC m=+6198.890457288" watchObservedRunningTime="2026-02-23 08:27:56.648845435 +0000 UTC m=+6198.900172609" Feb 23 08:27:58 crc kubenswrapper[5047]: I0223 08:27:58.345642 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:27:58 crc kubenswrapper[5047]: E0223 08:27:58.346296 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:28:01 crc kubenswrapper[5047]: I0223 08:28:01.996762 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:28:01 crc kubenswrapper[5047]: I0223 08:28:01.997458 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:28:02 crc kubenswrapper[5047]: I0223 08:28:02.060253 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:28:02 crc kubenswrapper[5047]: I0223 08:28:02.748551 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:28:02 crc kubenswrapper[5047]: I0223 08:28:02.839078 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfvrc"] Feb 23 08:28:04 crc kubenswrapper[5047]: I0223 08:28:04.689341 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zfvrc" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="registry-server" containerID="cri-o://b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1" gracePeriod=2 Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.190479 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.300592 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-utilities\") pod \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.300797 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gr48\" (UniqueName: \"kubernetes.io/projected/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-kube-api-access-8gr48\") pod \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.300835 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-catalog-content\") pod \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\" (UID: \"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211\") " Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.302065 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-utilities" (OuterVolumeSpecName: "utilities") pod "a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" (UID: "a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.312164 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-kube-api-access-8gr48" (OuterVolumeSpecName: "kube-api-access-8gr48") pod "a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" (UID: "a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211"). InnerVolumeSpecName "kube-api-access-8gr48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.404801 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gr48\" (UniqueName: \"kubernetes.io/projected/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-kube-api-access-8gr48\") on node \"crc\" DevicePath \"\"" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.405420 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.406408 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" (UID: "a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.507961 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.703824 5047 generic.go:334] "Generic (PLEG): container finished" podID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerID="b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1" exitCode=0 Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.703897 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfvrc" event={"ID":"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211","Type":"ContainerDied","Data":"b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1"} Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.703971 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-zfvrc" event={"ID":"a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211","Type":"ContainerDied","Data":"65d65533a7540cb3b524bf6dcc0fbece7b4d6fa7ec3469c4a9554578d398d7c2"} Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.704004 5047 scope.go:117] "RemoveContainer" containerID="b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.707004 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfvrc" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.739267 5047 scope.go:117] "RemoveContainer" containerID="5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.780698 5047 scope.go:117] "RemoveContainer" containerID="1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.790637 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfvrc"] Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.803349 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfvrc"] Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.808601 5047 scope.go:117] "RemoveContainer" containerID="b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1" Feb 23 08:28:05 crc kubenswrapper[5047]: E0223 08:28:05.809424 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1\": container with ID starting with b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1 not found: ID does not exist" containerID="b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.809590 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1"} err="failed to get container status \"b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1\": rpc error: code = NotFound desc = could not find container \"b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1\": container with ID starting with b799d6b5be0e37b57f42e0b0419fb244e9c560532c7ebfc2209b2255633058c1 not found: ID does not exist" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.809726 5047 scope.go:117] "RemoveContainer" containerID="5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39" Feb 23 08:28:05 crc kubenswrapper[5047]: E0223 08:28:05.810590 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39\": container with ID starting with 5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39 not found: ID does not exist" containerID="5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.810635 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39"} err="failed to get container status \"5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39\": rpc error: code = NotFound desc = could not find container \"5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39\": container with ID starting with 5229baa7c57b0f00b7a3e546ffbb0b6d64f73dc88687eae84186af9c116c7e39 not found: ID does not exist" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.810667 5047 scope.go:117] "RemoveContainer" containerID="1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e" Feb 23 08:28:05 crc kubenswrapper[5047]: E0223 
08:28:05.811086 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e\": container with ID starting with 1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e not found: ID does not exist" containerID="1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e" Feb 23 08:28:05 crc kubenswrapper[5047]: I0223 08:28:05.811169 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e"} err="failed to get container status \"1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e\": rpc error: code = NotFound desc = could not find container \"1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e\": container with ID starting with 1ac6274b6e99664025ac66d15b34243440215ded2e00077c7b67c2dbb3f96b0e not found: ID does not exist" Feb 23 08:28:06 crc kubenswrapper[5047]: I0223 08:28:06.358834 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" path="/var/lib/kubelet/pods/a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211/volumes" Feb 23 08:28:12 crc kubenswrapper[5047]: I0223 08:28:12.341883 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:28:12 crc kubenswrapper[5047]: E0223 08:28:12.343077 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:28:26 crc kubenswrapper[5047]: I0223 08:28:26.341426 
5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:28:26 crc kubenswrapper[5047]: E0223 08:28:26.342581 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:28:40 crc kubenswrapper[5047]: I0223 08:28:40.343514 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:28:40 crc kubenswrapper[5047]: E0223 08:28:40.346236 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:28:55 crc kubenswrapper[5047]: I0223 08:28:55.341634 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:28:55 crc kubenswrapper[5047]: E0223 08:28:55.342711 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:29:10 crc kubenswrapper[5047]: I0223 
08:29:10.340674 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:29:10 crc kubenswrapper[5047]: E0223 08:29:10.341487 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.527327 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-774d9db845-6lt25"] Feb 23 08:29:13 crc kubenswrapper[5047]: E0223 08:29:13.527930 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="extract-content" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.527946 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="extract-content" Feb 23 08:29:13 crc kubenswrapper[5047]: E0223 08:29:13.527972 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="extract-utilities" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.527981 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="extract-utilities" Feb 23 08:29:13 crc kubenswrapper[5047]: E0223 08:29:13.528008 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="registry-server" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.528015 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="registry-server" Feb 23 08:29:13 
crc kubenswrapper[5047]: I0223 08:29:13.528171 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75b7cf6-ddc3-40bc-892b-a4c1fd3c0211" containerName="registry-server" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.528847 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.530936 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.531186 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.543180 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fwlcq" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.543385 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.560943 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774d9db845-6lt25"] Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.621378 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f67b98cb7-p9d6t"] Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.622468 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.626194 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.641375 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f67b98cb7-p9d6t"] Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.704615 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h6fc\" (UniqueName: \"kubernetes.io/projected/e91a6ea2-940e-421f-af86-c26daacb7a9f-kube-api-access-5h6fc\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.704672 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd0be5-188f-4b4a-a3b0-6008f397120c-config\") pod \"dnsmasq-dns-774d9db845-6lt25\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.704704 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nz2\" (UniqueName: \"kubernetes.io/projected/b4cd0be5-188f-4b4a-a3b0-6008f397120c-kube-api-access-88nz2\") pod \"dnsmasq-dns-774d9db845-6lt25\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.705058 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-config\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " 
pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.705162 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-dns-svc\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.806578 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-config\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.806642 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-dns-svc\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.806668 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h6fc\" (UniqueName: \"kubernetes.io/projected/e91a6ea2-940e-421f-af86-c26daacb7a9f-kube-api-access-5h6fc\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.806694 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd0be5-188f-4b4a-a3b0-6008f397120c-config\") pod \"dnsmasq-dns-774d9db845-6lt25\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc 
kubenswrapper[5047]: I0223 08:29:13.806719 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nz2\" (UniqueName: \"kubernetes.io/projected/b4cd0be5-188f-4b4a-a3b0-6008f397120c-kube-api-access-88nz2\") pod \"dnsmasq-dns-774d9db845-6lt25\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.807935 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-config\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.807956 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-dns-svc\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.808725 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd0be5-188f-4b4a-a3b0-6008f397120c-config\") pod \"dnsmasq-dns-774d9db845-6lt25\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.833111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h6fc\" (UniqueName: \"kubernetes.io/projected/e91a6ea2-940e-421f-af86-c26daacb7a9f-kube-api-access-5h6fc\") pod \"dnsmasq-dns-7f67b98cb7-p9d6t\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.833120 5047 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-88nz2\" (UniqueName: \"kubernetes.io/projected/b4cd0be5-188f-4b4a-a3b0-6008f397120c-kube-api-access-88nz2\") pod \"dnsmasq-dns-774d9db845-6lt25\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.845245 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:13 crc kubenswrapper[5047]: I0223 08:29:13.940789 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.013532 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774d9db845-6lt25"] Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.047964 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-kxv5b"] Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.049207 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.081026 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-kxv5b"] Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.221760 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-config\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.221814 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rd6w\" (UniqueName: \"kubernetes.io/projected/1411ee02-d976-4ed6-8125-111f5fdbb39e-kube-api-access-4rd6w\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.221862 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-dns-svc\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.323469 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-dns-svc\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.323573 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-config\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.323604 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rd6w\" (UniqueName: \"kubernetes.io/projected/1411ee02-d976-4ed6-8125-111f5fdbb39e-kube-api-access-4rd6w\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.324522 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-dns-svc\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.325743 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-config\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.328868 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-kxv5b"] Feb 23 08:29:14 crc kubenswrapper[5047]: E0223 08:29:14.329407 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4rd6w], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" podUID="1411ee02-d976-4ed6-8125-111f5fdbb39e" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.342588 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rd6w\" (UniqueName: 
\"kubernetes.io/projected/1411ee02-d976-4ed6-8125-111f5fdbb39e-kube-api-access-4rd6w\") pod \"dnsmasq-dns-787c4dc9-kxv5b\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.354864 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.355204 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-zg89r"] Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.358553 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.369863 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-zg89r"] Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.371961 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.406659 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774d9db845-6lt25"] Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.499603 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f67b98cb7-p9d6t"] Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526047 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-dns-svc\") pod \"1411ee02-d976-4ed6-8125-111f5fdbb39e\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526139 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-config\") pod \"1411ee02-d976-4ed6-8125-111f5fdbb39e\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526208 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rd6w\" (UniqueName: \"kubernetes.io/projected/1411ee02-d976-4ed6-8125-111f5fdbb39e-kube-api-access-4rd6w\") pod \"1411ee02-d976-4ed6-8125-111f5fdbb39e\" (UID: \"1411ee02-d976-4ed6-8125-111f5fdbb39e\") " Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526408 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b825\" (UniqueName: \"kubernetes.io/projected/2e02ce02-2283-4c5d-8094-e8826849eea3-kube-api-access-6b825\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526470 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-dns-svc\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526494 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-config\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526522 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1411ee02-d976-4ed6-8125-111f5fdbb39e" (UID: "1411ee02-d976-4ed6-8125-111f5fdbb39e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.526951 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-config" (OuterVolumeSpecName: "config") pod "1411ee02-d976-4ed6-8125-111f5fdbb39e" (UID: "1411ee02-d976-4ed6-8125-111f5fdbb39e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.529543 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1411ee02-d976-4ed6-8125-111f5fdbb39e-kube-api-access-4rd6w" (OuterVolumeSpecName: "kube-api-access-4rd6w") pod "1411ee02-d976-4ed6-8125-111f5fdbb39e" (UID: "1411ee02-d976-4ed6-8125-111f5fdbb39e"). InnerVolumeSpecName "kube-api-access-4rd6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.628694 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b825\" (UniqueName: \"kubernetes.io/projected/2e02ce02-2283-4c5d-8094-e8826849eea3-kube-api-access-6b825\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.628769 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-dns-svc\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.628790 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-config\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.628892 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.628926 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1411ee02-d976-4ed6-8125-111f5fdbb39e-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.628938 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rd6w\" (UniqueName: \"kubernetes.io/projected/1411ee02-d976-4ed6-8125-111f5fdbb39e-kube-api-access-4rd6w\") on node \"crc\" DevicePath \"\"" Feb 23 
08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.629777 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-dns-svc\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.630951 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-config\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.646090 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b825\" (UniqueName: \"kubernetes.io/projected/2e02ce02-2283-4c5d-8094-e8826849eea3-kube-api-access-6b825\") pod \"dnsmasq-dns-bb88b7bf5-zg89r\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") " pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:14 crc kubenswrapper[5047]: I0223 08:29:14.684087 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.063697 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-zg89r"] Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.187144 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.188359 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.196109 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.196691 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.196723 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.196913 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.197122 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.197307 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rsnxc" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.197498 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.203743 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.346769 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.346827 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.346885 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.346935 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4sd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-kube-api-access-nt4sd\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.346968 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.346995 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.347019 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.347054 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.347967 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.348164 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.348219 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.366798 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774d9db845-6lt25" 
event={"ID":"b4cd0be5-188f-4b4a-a3b0-6008f397120c","Type":"ContainerStarted","Data":"3aec3f13a89e8811b4f525dcea37e9c823de2b13beee04b4fd171e0ab0ac4c58"} Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.368407 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" event={"ID":"e91a6ea2-940e-421f-af86-c26daacb7a9f","Type":"ContainerStarted","Data":"156faa5fa3e5671441977fb278a15ae2122be4f8df41a2efe5d9de5f5598e6c8"} Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.369897 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-kxv5b" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.370873 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" event={"ID":"2e02ce02-2283-4c5d-8094-e8826849eea3","Type":"ContainerStarted","Data":"51a0c1fa3323c8c0ab637b3cee9760c5e8ce872c8bbbb000acad46b4fe3032a2"} Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452442 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452505 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452592 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452632 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452676 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452746 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452778 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.452841 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 
08:29:15.453046 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.453086 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4sd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-kube-api-access-nt4sd\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.453143 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.454673 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.455244 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.455377 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.456673 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.457095 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.459780 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.466044 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.466422 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aabf718db6f5931eb11327a98efab57b5892cb45001a61dea3c49cc7bf70a70f/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.467968 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-kxv5b"] Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.474317 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.475347 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.479230 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.483281 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4sd\" (UniqueName: 
\"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-kube-api-access-nt4sd\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.486763 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-kxv5b"] Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.496981 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.498581 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.501186 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.501330 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.501369 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.501536 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.501637 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.501719 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-r8k72" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.501887 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.504806 5047 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.572256 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") " pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656056 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656107 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656127 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656144 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656166 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656200 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656244 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656269 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656297 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/474079eb-cdc7-47b4-b555-8909f6a658d5-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656315 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l527f\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-kube-api-access-l527f\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.656332 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474079eb-cdc7-47b4-b555-8909f6a658d5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758112 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758152 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758188 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/474079eb-cdc7-47b4-b555-8909f6a658d5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758216 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l527f\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-kube-api-access-l527f\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758235 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474079eb-cdc7-47b4-b555-8909f6a658d5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758278 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758312 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758336 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758352 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758371 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.758419 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.759293 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.762367 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: 
I0223 08:29:15.762496 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.762928 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.763456 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.763720 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.768575 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474079eb-cdc7-47b4-b555-8909f6a658d5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.769137 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/474079eb-cdc7-47b4-b555-8909f6a658d5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.769222 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.769989 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.770018 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/577037528b2482e0b9f6f00316925bdadde6388bc3db266287efd4a6d1ff6d9e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.779816 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l527f\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-kube-api-access-l527f\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.798683 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.817461 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:29:15 crc kubenswrapper[5047]: I0223 08:29:15.885464 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.270065 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.281736 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.282942 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.290255 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zslw9" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.290440 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.290574 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.290698 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.320311 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.326702 5047 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.379967 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380047 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-operator-scripts\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380102 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6ngk\" (UniqueName: \"kubernetes.io/projected/753e53f0-d44b-4af2-9aff-eabc1a46d537-kube-api-access-w6ngk\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380181 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-generated\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380297 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1411ee02-d976-4ed6-8125-111f5fdbb39e" path="/var/lib/kubelet/pods/1411ee02-d976-4ed6-8125-111f5fdbb39e/volumes" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380326 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-default\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380355 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-kolla-config\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380391 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.380418 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.394881 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.398529 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"474079eb-cdc7-47b4-b555-8909f6a658d5","Type":"ContainerStarted","Data":"76e660be09ff3d762b1380811c3bb838e87214d64f27ff0227a10b3ddef21f3f"} Feb 23 08:29:16 crc 
kubenswrapper[5047]: W0223 08:29:16.416745 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c88bdf_c040_42b6_a5e4_269f32bd99d0.slice/crio-769d0093cc86d81251a1d5e6420ef2cad2f8a7fad4949b36b037dd2216a38ab8 WatchSource:0}: Error finding container 769d0093cc86d81251a1d5e6420ef2cad2f8a7fad4949b36b037dd2216a38ab8: Status 404 returned error can't find the container with id 769d0093cc86d81251a1d5e6420ef2cad2f8a7fad4949b36b037dd2216a38ab8 Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.481874 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.481964 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-operator-scripts\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.482010 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6ngk\" (UniqueName: \"kubernetes.io/projected/753e53f0-d44b-4af2-9aff-eabc1a46d537-kube-api-access-w6ngk\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.482035 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-generated\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " 
pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.482077 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-default\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.482112 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-kolla-config\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.482155 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.482179 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.484340 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-generated\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.484795 
5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-kolla-config\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.484948 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-operator-scripts\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.485955 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-default\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.487273 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.487329 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/05f6499c08e5fc64d01c3dddf666eb716258bf967132f5ba622154b8286f4bd2/globalmount\"" pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.488969 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.489325 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.509883 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6ngk\" (UniqueName: \"kubernetes.io/projected/753e53f0-d44b-4af2-9aff-eabc1a46d537-kube-api-access-w6ngk\") pod \"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.528755 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\") pod 
\"openstack-galera-0\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " pod="openstack/openstack-galera-0" Feb 23 08:29:16 crc kubenswrapper[5047]: I0223 08:29:16.653383 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.226336 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 08:29:17 crc kubenswrapper[5047]: W0223 08:29:17.239526 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod753e53f0_d44b_4af2_9aff_eabc1a46d537.slice/crio-3ecaac4e9c6f1e95cfd65140c33e4209babfd73d1e7084f88cb7a0312107bc12 WatchSource:0}: Error finding container 3ecaac4e9c6f1e95cfd65140c33e4209babfd73d1e7084f88cb7a0312107bc12: Status 404 returned error can't find the container with id 3ecaac4e9c6f1e95cfd65140c33e4209babfd73d1e7084f88cb7a0312107bc12 Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.413383 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7c88bdf-c040-42b6-a5e4-269f32bd99d0","Type":"ContainerStarted","Data":"769d0093cc86d81251a1d5e6420ef2cad2f8a7fad4949b36b037dd2216a38ab8"} Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.415653 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"753e53f0-d44b-4af2-9aff-eabc1a46d537","Type":"ContainerStarted","Data":"3ecaac4e9c6f1e95cfd65140c33e4209babfd73d1e7084f88cb7a0312107bc12"} Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.676481 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.678033 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.681275 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.681584 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-w9pwf" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.681761 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.687175 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.702237 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703217 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703265 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74700fa7-59df-4201-a7c4-de815b82208e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703312 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703407 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703430 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703453 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vldcx\" (UniqueName: \"kubernetes.io/projected/74700fa7-59df-4201-a7c4-de815b82208e-kube-api-access-vldcx\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703479 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.703535 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.805961 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.806062 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.807411 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.807470 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.807524 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vldcx\" (UniqueName: \"kubernetes.io/projected/74700fa7-59df-4201-a7c4-de815b82208e-kube-api-access-vldcx\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.807870 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.808862 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.808941 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.810766 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.810842 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/74700fa7-59df-4201-a7c4-de815b82208e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.811407 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74700fa7-59df-4201-a7c4-de815b82208e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.812244 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.814411 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.820768 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.820831 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c3ba4254e64e46e891c58e7ffb86002367c30e44ef7851584770669baf19d10e/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.827422 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vldcx\" (UniqueName: \"kubernetes.io/projected/74700fa7-59df-4201-a7c4-de815b82208e-kube-api-access-vldcx\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.829057 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:17 crc kubenswrapper[5047]: I0223 08:29:17.861438 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\") pod \"openstack-cell1-galera-0\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.012535 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.283032 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.284040 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.286041 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.290932 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.291076 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-259gt" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.303068 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.426094 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-memcached-tls-certs\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.426464 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-kolla-config\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.426519 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-combined-ca-bundle\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.426559 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmtb\" (UniqueName: \"kubernetes.io/projected/39cf7673-4d38-49e0-9b86-f80c3949fd06-kube-api-access-6xmtb\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.426665 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-config-data\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.512062 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 08:29:18 crc kubenswrapper[5047]: W0223 08:29:18.513083 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74700fa7_59df_4201_a7c4_de815b82208e.slice/crio-4bd063b40bce2917ee513c5473c64da14c8fa9673242bb9cb0ad59ab6f069bdf WatchSource:0}: Error finding container 4bd063b40bce2917ee513c5473c64da14c8fa9673242bb9cb0ad59ab6f069bdf: Status 404 returned error can't find the container with id 4bd063b40bce2917ee513c5473c64da14c8fa9673242bb9cb0ad59ab6f069bdf Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.528054 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-combined-ca-bundle\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" 
Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.528122 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmtb\" (UniqueName: \"kubernetes.io/projected/39cf7673-4d38-49e0-9b86-f80c3949fd06-kube-api-access-6xmtb\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.528150 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-config-data\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.528252 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-memcached-tls-certs\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.528275 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-kolla-config\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.532538 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-kolla-config\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.532757 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-config-data\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.537388 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-memcached-tls-certs\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.538580 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-combined-ca-bundle\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.549095 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmtb\" (UniqueName: \"kubernetes.io/projected/39cf7673-4d38-49e0-9b86-f80c3949fd06-kube-api-access-6xmtb\") pod \"memcached-0\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") " pod="openstack/memcached-0" Feb 23 08:29:18 crc kubenswrapper[5047]: I0223 08:29:18.609720 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 23 08:29:19 crc kubenswrapper[5047]: I0223 08:29:19.291281 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 08:29:19 crc kubenswrapper[5047]: I0223 08:29:19.453942 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"39cf7673-4d38-49e0-9b86-f80c3949fd06","Type":"ContainerStarted","Data":"01cddf8ee5f86dd10417671e610044e5e1f989298e6722e7beab0a6a68c0b35b"} Feb 23 08:29:19 crc kubenswrapper[5047]: I0223 08:29:19.456447 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"74700fa7-59df-4201-a7c4-de815b82208e","Type":"ContainerStarted","Data":"4bd063b40bce2917ee513c5473c64da14c8fa9673242bb9cb0ad59ab6f069bdf"} Feb 23 08:29:22 crc kubenswrapper[5047]: I0223 08:29:22.343604 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:29:22 crc kubenswrapper[5047]: E0223 08:29:22.346983 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:29:36 crc kubenswrapper[5047]: I0223 08:29:36.342801 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:29:36 crc kubenswrapper[5047]: E0223 08:29:36.344661 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:42.999426 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.000708 5047 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.000986 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5h6fc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7f67b98cb7-p9d6t_openstack(e91a6ea2-940e-421f-af86-c26daacb7a9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.002234 5047 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.005589 5047 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.005630 5047 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.005745 5047 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h564h676h699hcdh67bh66hfdh569h545h648h94h546h696h668h89h96h667h575h595h5d9h584h8dhbdh697h54bhb7h58fh5c9hd8h5cdh5c7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6b825,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bb88b7bf5-zg89r_openstack(2e02ce02-2283-4c5d-8094-e8826849eea3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.006976 5047 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.228595 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" Feb 23 08:29:43 crc kubenswrapper[5047]: E0223 08:29:43.228711 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" Feb 23 08:29:47 crc kubenswrapper[5047]: I0223 08:29:47.341306 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:29:47 crc kubenswrapper[5047]: E0223 08:29:47.342356 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.279868 5047 generic.go:334] "Generic (PLEG): container finished" podID="b4cd0be5-188f-4b4a-a3b0-6008f397120c" 
containerID="80011e847f17d975fa9c68e2d530325d6ae6815a9d2f109c4ce1e0b7b347bcc8" exitCode=0 Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.279984 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774d9db845-6lt25" event={"ID":"b4cd0be5-188f-4b4a-a3b0-6008f397120c","Type":"ContainerDied","Data":"80011e847f17d975fa9c68e2d530325d6ae6815a9d2f109c4ce1e0b7b347bcc8"} Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.283121 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"753e53f0-d44b-4af2-9aff-eabc1a46d537","Type":"ContainerStarted","Data":"abc7a14060746168d3aab5953a1dec8484ca0c6b35fab048b6695351e5711f38"} Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.285262 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"39cf7673-4d38-49e0-9b86-f80c3949fd06","Type":"ContainerStarted","Data":"2fb193ba990e8911587f36aa7890ab58c29c03d0ee9c8cdec4e9fb37cf8c3f1b"} Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.285411 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.287405 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"74700fa7-59df-4201-a7c4-de815b82208e","Type":"ContainerStarted","Data":"919ae0467b0ab826c5b9f2356df51aec4127e389561d05792503362c2de42572"} Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.383831 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.578140311 podStartE2EDuration="30.383809902s" podCreationTimestamp="2026-02-23 08:29:18 +0000 UTC" firstStartedPulling="2026-02-23 08:29:19.303694827 +0000 UTC m=+6281.555021961" lastFinishedPulling="2026-02-23 08:29:47.109364408 +0000 UTC m=+6309.360691552" observedRunningTime="2026-02-23 08:29:48.375297543 +0000 UTC 
m=+6310.626624707" watchObservedRunningTime="2026-02-23 08:29:48.383809902 +0000 UTC m=+6310.635137056" Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.765385 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.889630 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88nz2\" (UniqueName: \"kubernetes.io/projected/b4cd0be5-188f-4b4a-a3b0-6008f397120c-kube-api-access-88nz2\") pod \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.889745 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd0be5-188f-4b4a-a3b0-6008f397120c-config\") pod \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\" (UID: \"b4cd0be5-188f-4b4a-a3b0-6008f397120c\") " Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.898154 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cd0be5-188f-4b4a-a3b0-6008f397120c-kube-api-access-88nz2" (OuterVolumeSpecName: "kube-api-access-88nz2") pod "b4cd0be5-188f-4b4a-a3b0-6008f397120c" (UID: "b4cd0be5-188f-4b4a-a3b0-6008f397120c"). InnerVolumeSpecName "kube-api-access-88nz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.920367 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4cd0be5-188f-4b4a-a3b0-6008f397120c-config" (OuterVolumeSpecName: "config") pod "b4cd0be5-188f-4b4a-a3b0-6008f397120c" (UID: "b4cd0be5-188f-4b4a-a3b0-6008f397120c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.991509 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88nz2\" (UniqueName: \"kubernetes.io/projected/b4cd0be5-188f-4b4a-a3b0-6008f397120c-kube-api-access-88nz2\") on node \"crc\" DevicePath \"\"" Feb 23 08:29:48 crc kubenswrapper[5047]: I0223 08:29:48.991552 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd0be5-188f-4b4a-a3b0-6008f397120c-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:29:49 crc kubenswrapper[5047]: I0223 08:29:49.299945 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7c88bdf-c040-42b6-a5e4-269f32bd99d0","Type":"ContainerStarted","Data":"6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739"} Feb 23 08:29:49 crc kubenswrapper[5047]: I0223 08:29:49.303627 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774d9db845-6lt25" event={"ID":"b4cd0be5-188f-4b4a-a3b0-6008f397120c","Type":"ContainerDied","Data":"3aec3f13a89e8811b4f525dcea37e9c823de2b13beee04b4fd171e0ab0ac4c58"} Feb 23 08:29:49 crc kubenswrapper[5047]: I0223 08:29:49.303697 5047 scope.go:117] "RemoveContainer" containerID="80011e847f17d975fa9c68e2d530325d6ae6815a9d2f109c4ce1e0b7b347bcc8" Feb 23 08:29:49 crc kubenswrapper[5047]: I0223 08:29:49.303697 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774d9db845-6lt25" Feb 23 08:29:49 crc kubenswrapper[5047]: I0223 08:29:49.309518 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"474079eb-cdc7-47b4-b555-8909f6a658d5","Type":"ContainerStarted","Data":"8b24cf14a83f74292a635bcb65fe1121a5c60309a84222dbed4cc591f9970f6b"} Feb 23 08:29:49 crc kubenswrapper[5047]: I0223 08:29:49.446706 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774d9db845-6lt25"] Feb 23 08:29:49 crc kubenswrapper[5047]: I0223 08:29:49.454702 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-774d9db845-6lt25"] Feb 23 08:29:50 crc kubenswrapper[5047]: I0223 08:29:50.387061 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4cd0be5-188f-4b4a-a3b0-6008f397120c" path="/var/lib/kubelet/pods/b4cd0be5-188f-4b4a-a3b0-6008f397120c/volumes" Feb 23 08:29:51 crc kubenswrapper[5047]: I0223 08:29:51.395465 5047 generic.go:334] "Generic (PLEG): container finished" podID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerID="abc7a14060746168d3aab5953a1dec8484ca0c6b35fab048b6695351e5711f38" exitCode=0 Feb 23 08:29:51 crc kubenswrapper[5047]: I0223 08:29:51.395605 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"753e53f0-d44b-4af2-9aff-eabc1a46d537","Type":"ContainerDied","Data":"abc7a14060746168d3aab5953a1dec8484ca0c6b35fab048b6695351e5711f38"} Feb 23 08:29:51 crc kubenswrapper[5047]: I0223 08:29:51.398292 5047 generic.go:334] "Generic (PLEG): container finished" podID="74700fa7-59df-4201-a7c4-de815b82208e" containerID="919ae0467b0ab826c5b9f2356df51aec4127e389561d05792503362c2de42572" exitCode=0 Feb 23 08:29:51 crc kubenswrapper[5047]: I0223 08:29:51.398362 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"74700fa7-59df-4201-a7c4-de815b82208e","Type":"ContainerDied","Data":"919ae0467b0ab826c5b9f2356df51aec4127e389561d05792503362c2de42572"} Feb 23 08:29:52 crc kubenswrapper[5047]: I0223 08:29:52.415490 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"74700fa7-59df-4201-a7c4-de815b82208e","Type":"ContainerStarted","Data":"000d3c24fd99b6c2ed581b5f5c4fd5b32ee63c9bba247e2ec7f68a77055306d1"} Feb 23 08:29:52 crc kubenswrapper[5047]: I0223 08:29:52.420405 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"753e53f0-d44b-4af2-9aff-eabc1a46d537","Type":"ContainerStarted","Data":"4373b06bebc37262c61b269b46fec7e7e686ae9f047fd098351acbd0de2a4b21"} Feb 23 08:29:52 crc kubenswrapper[5047]: I0223 08:29:52.461148 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.79969738 podStartE2EDuration="36.461114848s" podCreationTimestamp="2026-02-23 08:29:16 +0000 UTC" firstStartedPulling="2026-02-23 08:29:18.515757594 +0000 UTC m=+6280.767084718" lastFinishedPulling="2026-02-23 08:29:47.177175052 +0000 UTC m=+6309.428502186" observedRunningTime="2026-02-23 08:29:52.453761141 +0000 UTC m=+6314.705088315" watchObservedRunningTime="2026-02-23 08:29:52.461114848 +0000 UTC m=+6314.712442022" Feb 23 08:29:52 crc kubenswrapper[5047]: I0223 08:29:52.494830 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.53696203 podStartE2EDuration="37.494786244s" podCreationTimestamp="2026-02-23 08:29:15 +0000 UTC" firstStartedPulling="2026-02-23 08:29:17.241972197 +0000 UTC m=+6279.493299331" lastFinishedPulling="2026-02-23 08:29:47.199796391 +0000 UTC m=+6309.451123545" observedRunningTime="2026-02-23 08:29:52.486883172 +0000 UTC m=+6314.738210336" watchObservedRunningTime="2026-02-23 08:29:52.494786244 +0000 UTC 
m=+6314.746113418" Feb 23 08:29:53 crc kubenswrapper[5047]: I0223 08:29:53.611088 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 23 08:29:55 crc kubenswrapper[5047]: E0223 08:29:55.498452 5047 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.129:58532->38.102.83.129:42399: write tcp 38.102.83.129:58532->38.102.83.129:42399: write: broken pipe Feb 23 08:29:56 crc kubenswrapper[5047]: I0223 08:29:56.654375 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 23 08:29:56 crc kubenswrapper[5047]: I0223 08:29:56.654449 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 23 08:29:58 crc kubenswrapper[5047]: I0223 08:29:58.013273 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:58 crc kubenswrapper[5047]: I0223 08:29:58.013802 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:58 crc kubenswrapper[5047]: I0223 08:29:58.139847 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:58 crc kubenswrapper[5047]: I0223 08:29:58.344921 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:29:58 crc kubenswrapper[5047]: E0223 08:29:58.347266 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 
08:29:58 crc kubenswrapper[5047]: I0223 08:29:58.581002 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 23 08:29:59 crc kubenswrapper[5047]: I0223 08:29:59.487461 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" event={"ID":"2e02ce02-2283-4c5d-8094-e8826849eea3","Type":"ContainerStarted","Data":"3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd"} Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.154781 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll"] Feb 23 08:30:00 crc kubenswrapper[5047]: E0223 08:30:00.155690 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cd0be5-188f-4b4a-a3b0-6008f397120c" containerName="init" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.155784 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cd0be5-188f-4b4a-a3b0-6008f397120c" containerName="init" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.156064 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cd0be5-188f-4b4a-a3b0-6008f397120c" containerName="init" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.159627 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.164311 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.164701 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.169047 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll"] Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.298947 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xp4h\" (UniqueName: \"kubernetes.io/projected/44c8183f-c5e8-470e-81ad-ff946c884704-kube-api-access-2xp4h\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.299012 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44c8183f-c5e8-470e-81ad-ff946c884704-secret-volume\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.299044 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44c8183f-c5e8-470e-81ad-ff946c884704-config-volume\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.401194 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44c8183f-c5e8-470e-81ad-ff946c884704-secret-volume\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.401243 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44c8183f-c5e8-470e-81ad-ff946c884704-config-volume\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.401363 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xp4h\" (UniqueName: \"kubernetes.io/projected/44c8183f-c5e8-470e-81ad-ff946c884704-kube-api-access-2xp4h\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.402368 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44c8183f-c5e8-470e-81ad-ff946c884704-config-volume\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.410975 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/44c8183f-c5e8-470e-81ad-ff946c884704-secret-volume\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.418871 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xp4h\" (UniqueName: \"kubernetes.io/projected/44c8183f-c5e8-470e-81ad-ff946c884704-kube-api-access-2xp4h\") pod \"collect-profiles-29530590-fkfll\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.486489 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.500037 5047 generic.go:334] "Generic (PLEG): container finished" podID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerID="d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f" exitCode=0 Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.500146 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" event={"ID":"e91a6ea2-940e-421f-af86-c26daacb7a9f","Type":"ContainerDied","Data":"d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f"} Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.503927 5047 generic.go:334] "Generic (PLEG): container finished" podID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerID="3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd" exitCode=0 Feb 23 08:30:00 crc kubenswrapper[5047]: I0223 08:30:00.503994 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" 
event={"ID":"2e02ce02-2283-4c5d-8094-e8826849eea3","Type":"ContainerDied","Data":"3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd"} Feb 23 08:30:01 crc kubenswrapper[5047]: W0223 08:30:01.010228 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c8183f_c5e8_470e_81ad_ff946c884704.slice/crio-bda4c4e67625c3ccf4e7ab4d01aa9092e94a88262297c30d49832b773466faa6 WatchSource:0}: Error finding container bda4c4e67625c3ccf4e7ab4d01aa9092e94a88262297c30d49832b773466faa6: Status 404 returned error can't find the container with id bda4c4e67625c3ccf4e7ab4d01aa9092e94a88262297c30d49832b773466faa6 Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.010844 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll"] Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.512246 5047 generic.go:334] "Generic (PLEG): container finished" podID="44c8183f-c5e8-470e-81ad-ff946c884704" containerID="0461be2d36c08dcc087491a15dc454eaec732c0a5803e8b7153b89c62ebc128d" exitCode=0 Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.512341 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" event={"ID":"44c8183f-c5e8-470e-81ad-ff946c884704","Type":"ContainerDied","Data":"0461be2d36c08dcc087491a15dc454eaec732c0a5803e8b7153b89c62ebc128d"} Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.512382 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" event={"ID":"44c8183f-c5e8-470e-81ad-ff946c884704","Type":"ContainerStarted","Data":"bda4c4e67625c3ccf4e7ab4d01aa9092e94a88262297c30d49832b773466faa6"} Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.516289 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" 
event={"ID":"2e02ce02-2283-4c5d-8094-e8826849eea3","Type":"ContainerStarted","Data":"3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af"} Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.516493 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.518147 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" event={"ID":"e91a6ea2-940e-421f-af86-c26daacb7a9f","Type":"ContainerStarted","Data":"ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34"} Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.518372 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.551851 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" podStartSLOduration=-9223371989.302946 podStartE2EDuration="47.551829612s" podCreationTimestamp="2026-02-23 08:29:14 +0000 UTC" firstStartedPulling="2026-02-23 08:29:15.08924546 +0000 UTC m=+6277.340572584" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:30:01.549573692 +0000 UTC m=+6323.800900826" watchObservedRunningTime="2026-02-23 08:30:01.551829612 +0000 UTC m=+6323.803156756" Feb 23 08:30:01 crc kubenswrapper[5047]: I0223 08:30:01.575087 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" podStartSLOduration=-9223371988.279718 podStartE2EDuration="48.575057398s" podCreationTimestamp="2026-02-23 08:29:13 +0000 UTC" firstStartedPulling="2026-02-23 08:29:14.504407713 +0000 UTC m=+6276.755734847" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:30:01.572278203 +0000 UTC m=+6323.823605347" watchObservedRunningTime="2026-02-23 
08:30:01.575057398 +0000 UTC m=+6323.826384532" Feb 23 08:30:02 crc kubenswrapper[5047]: I0223 08:30:02.621351 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 23 08:30:02 crc kubenswrapper[5047]: I0223 08:30:02.742649 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 23 08:30:02 crc kubenswrapper[5047]: I0223 08:30:02.913115 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.088993 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xp4h\" (UniqueName: \"kubernetes.io/projected/44c8183f-c5e8-470e-81ad-ff946c884704-kube-api-access-2xp4h\") pod \"44c8183f-c5e8-470e-81ad-ff946c884704\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.089663 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44c8183f-c5e8-470e-81ad-ff946c884704-config-volume\") pod \"44c8183f-c5e8-470e-81ad-ff946c884704\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.089992 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44c8183f-c5e8-470e-81ad-ff946c884704-secret-volume\") pod \"44c8183f-c5e8-470e-81ad-ff946c884704\" (UID: \"44c8183f-c5e8-470e-81ad-ff946c884704\") " Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.090544 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c8183f-c5e8-470e-81ad-ff946c884704-config-volume" (OuterVolumeSpecName: "config-volume") pod "44c8183f-c5e8-470e-81ad-ff946c884704" (UID: 
"44c8183f-c5e8-470e-81ad-ff946c884704"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.090838 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44c8183f-c5e8-470e-81ad-ff946c884704-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.096029 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c8183f-c5e8-470e-81ad-ff946c884704-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "44c8183f-c5e8-470e-81ad-ff946c884704" (UID: "44c8183f-c5e8-470e-81ad-ff946c884704"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.097690 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c8183f-c5e8-470e-81ad-ff946c884704-kube-api-access-2xp4h" (OuterVolumeSpecName: "kube-api-access-2xp4h") pod "44c8183f-c5e8-470e-81ad-ff946c884704" (UID: "44c8183f-c5e8-470e-81ad-ff946c884704"). InnerVolumeSpecName "kube-api-access-2xp4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.192058 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xp4h\" (UniqueName: \"kubernetes.io/projected/44c8183f-c5e8-470e-81ad-ff946c884704-kube-api-access-2xp4h\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.192113 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44c8183f-c5e8-470e-81ad-ff946c884704-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.535863 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" event={"ID":"44c8183f-c5e8-470e-81ad-ff946c884704","Type":"ContainerDied","Data":"bda4c4e67625c3ccf4e7ab4d01aa9092e94a88262297c30d49832b773466faa6"} Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.535943 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bda4c4e67625c3ccf4e7ab4d01aa9092e94a88262297c30d49832b773466faa6" Feb 23 08:30:03 crc kubenswrapper[5047]: I0223 08:30:03.535954 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-fkfll" Feb 23 08:30:04 crc kubenswrapper[5047]: I0223 08:30:04.109924 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"] Feb 23 08:30:04 crc kubenswrapper[5047]: I0223 08:30:04.121979 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-m4rqw"] Feb 23 08:30:04 crc kubenswrapper[5047]: I0223 08:30:04.351743 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac5c32a-1ae4-472a-8158-73ecc3260f3d" path="/var/lib/kubelet/pods/fac5c32a-1ae4-472a-8158-73ecc3260f3d/volumes" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.231440 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-v9vpl"] Feb 23 08:30:05 crc kubenswrapper[5047]: E0223 08:30:05.232197 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c8183f-c5e8-470e-81ad-ff946c884704" containerName="collect-profiles" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.232214 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c8183f-c5e8-470e-81ad-ff946c884704" containerName="collect-profiles" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.232377 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c8183f-c5e8-470e-81ad-ff946c884704" containerName="collect-profiles" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.232897 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.237462 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.251145 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v9vpl"] Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.331608 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz75z\" (UniqueName: \"kubernetes.io/projected/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-kube-api-access-kz75z\") pod \"root-account-create-update-v9vpl\" (UID: \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.332055 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-operator-scripts\") pod \"root-account-create-update-v9vpl\" (UID: \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.433167 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-operator-scripts\") pod \"root-account-create-update-v9vpl\" (UID: \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.433255 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz75z\" (UniqueName: \"kubernetes.io/projected/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-kube-api-access-kz75z\") pod \"root-account-create-update-v9vpl\" (UID: 
\"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.434203 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-operator-scripts\") pod \"root-account-create-update-v9vpl\" (UID: \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.456531 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz75z\" (UniqueName: \"kubernetes.io/projected/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-kube-api-access-kz75z\") pod \"root-account-create-update-v9vpl\" (UID: \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:05 crc kubenswrapper[5047]: I0223 08:30:05.554280 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:06 crc kubenswrapper[5047]: I0223 08:30:06.026095 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v9vpl"] Feb 23 08:30:06 crc kubenswrapper[5047]: W0223 08:30:06.028746 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7332cd9_b2b7_45b7_a73c_44c25dde73d7.slice/crio-619687f42bab43e7c44a80ff7e4c1a939d3e856572cbd8f4e957da7b39c60c6d WatchSource:0}: Error finding container 619687f42bab43e7c44a80ff7e4c1a939d3e856572cbd8f4e957da7b39c60c6d: Status 404 returned error can't find the container with id 619687f42bab43e7c44a80ff7e4c1a939d3e856572cbd8f4e957da7b39c60c6d Feb 23 08:30:06 crc kubenswrapper[5047]: I0223 08:30:06.562661 5047 generic.go:334] "Generic (PLEG): container finished" podID="f7332cd9-b2b7-45b7-a73c-44c25dde73d7" containerID="6fbe93cb216655ba7c9450d1c62801e8c95c2afe0c00afd532f5ed1da9b6cacb" exitCode=0 Feb 23 08:30:06 crc kubenswrapper[5047]: I0223 08:30:06.563748 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v9vpl" event={"ID":"f7332cd9-b2b7-45b7-a73c-44c25dde73d7","Type":"ContainerDied","Data":"6fbe93cb216655ba7c9450d1c62801e8c95c2afe0c00afd532f5ed1da9b6cacb"} Feb 23 08:30:06 crc kubenswrapper[5047]: I0223 08:30:06.563874 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v9vpl" event={"ID":"f7332cd9-b2b7-45b7-a73c-44c25dde73d7","Type":"ContainerStarted","Data":"619687f42bab43e7c44a80ff7e4c1a939d3e856572cbd8f4e957da7b39c60c6d"} Feb 23 08:30:07 crc kubenswrapper[5047]: I0223 08:30:07.901336 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.082808 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-operator-scripts\") pod \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\" (UID: \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.083732 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz75z\" (UniqueName: \"kubernetes.io/projected/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-kube-api-access-kz75z\") pod \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\" (UID: \"f7332cd9-b2b7-45b7-a73c-44c25dde73d7\") " Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.084414 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7332cd9-b2b7-45b7-a73c-44c25dde73d7" (UID: "f7332cd9-b2b7-45b7-a73c-44c25dde73d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.085002 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.093359 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-kube-api-access-kz75z" (OuterVolumeSpecName: "kube-api-access-kz75z") pod "f7332cd9-b2b7-45b7-a73c-44c25dde73d7" (UID: "f7332cd9-b2b7-45b7-a73c-44c25dde73d7"). InnerVolumeSpecName "kube-api-access-kz75z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.186856 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz75z\" (UniqueName: \"kubernetes.io/projected/f7332cd9-b2b7-45b7-a73c-44c25dde73d7-kube-api-access-kz75z\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.588856 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v9vpl" event={"ID":"f7332cd9-b2b7-45b7-a73c-44c25dde73d7","Type":"ContainerDied","Data":"619687f42bab43e7c44a80ff7e4c1a939d3e856572cbd8f4e957da7b39c60c6d"} Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.588988 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="619687f42bab43e7c44a80ff7e4c1a939d3e856572cbd8f4e957da7b39c60c6d" Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.589105 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v9vpl" Feb 23 08:30:08 crc kubenswrapper[5047]: I0223 08:30:08.943252 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:30:09 crc kubenswrapper[5047]: I0223 08:30:09.686212 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" Feb 23 08:30:09 crc kubenswrapper[5047]: I0223 08:30:09.746212 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f67b98cb7-p9d6t"] Feb 23 08:30:09 crc kubenswrapper[5047]: I0223 08:30:09.746527 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerName="dnsmasq-dns" containerID="cri-o://ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34" gracePeriod=10 Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 
08:30:10.224220 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.318052 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h6fc\" (UniqueName: \"kubernetes.io/projected/e91a6ea2-940e-421f-af86-c26daacb7a9f-kube-api-access-5h6fc\") pod \"e91a6ea2-940e-421f-af86-c26daacb7a9f\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.318135 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-dns-svc\") pod \"e91a6ea2-940e-421f-af86-c26daacb7a9f\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.318154 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-config\") pod \"e91a6ea2-940e-421f-af86-c26daacb7a9f\" (UID: \"e91a6ea2-940e-421f-af86-c26daacb7a9f\") " Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.323214 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91a6ea2-940e-421f-af86-c26daacb7a9f-kube-api-access-5h6fc" (OuterVolumeSpecName: "kube-api-access-5h6fc") pod "e91a6ea2-940e-421f-af86-c26daacb7a9f" (UID: "e91a6ea2-940e-421f-af86-c26daacb7a9f"). InnerVolumeSpecName "kube-api-access-5h6fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.355956 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e91a6ea2-940e-421f-af86-c26daacb7a9f" (UID: "e91a6ea2-940e-421f-af86-c26daacb7a9f"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.372125 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-config" (OuterVolumeSpecName: "config") pod "e91a6ea2-940e-421f-af86-c26daacb7a9f" (UID: "e91a6ea2-940e-421f-af86-c26daacb7a9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.420401 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.420444 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91a6ea2-940e-421f-af86-c26daacb7a9f-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.420455 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h6fc\" (UniqueName: \"kubernetes.io/projected/e91a6ea2-940e-421f-af86-c26daacb7a9f-kube-api-access-5h6fc\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.608978 5047 generic.go:334] "Generic (PLEG): container finished" podID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerID="ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34" exitCode=0 Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.609042 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" event={"ID":"e91a6ea2-940e-421f-af86-c26daacb7a9f","Type":"ContainerDied","Data":"ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34"} Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.609076 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" event={"ID":"e91a6ea2-940e-421f-af86-c26daacb7a9f","Type":"ContainerDied","Data":"156faa5fa3e5671441977fb278a15ae2122be4f8df41a2efe5d9de5f5598e6c8"} Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.609073 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f67b98cb7-p9d6t" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.609189 5047 scope.go:117] "RemoveContainer" containerID="ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.647124 5047 scope.go:117] "RemoveContainer" containerID="d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.667208 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f67b98cb7-p9d6t"] Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.679854 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f67b98cb7-p9d6t"] Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.681896 5047 scope.go:117] "RemoveContainer" containerID="ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34" Feb 23 08:30:10 crc kubenswrapper[5047]: E0223 08:30:10.682484 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34\": container with ID starting with ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34 not found: ID does not exist" containerID="ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.682515 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34"} err="failed to get container status 
\"ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34\": rpc error: code = NotFound desc = could not find container \"ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34\": container with ID starting with ff06ea6d6524a8b7d07ad3ce9d5143094a1cf41fbc6a39e753177aa2f7226e34 not found: ID does not exist" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.682536 5047 scope.go:117] "RemoveContainer" containerID="d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f" Feb 23 08:30:10 crc kubenswrapper[5047]: E0223 08:30:10.683338 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f\": container with ID starting with d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f not found: ID does not exist" containerID="d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f" Feb 23 08:30:10 crc kubenswrapper[5047]: I0223 08:30:10.683414 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f"} err="failed to get container status \"d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f\": rpc error: code = NotFound desc = could not find container \"d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f\": container with ID starting with d26d8592a5b9f160e774efc2584f170796159f82b6c98d456cd5464279684c2f not found: ID does not exist" Feb 23 08:30:11 crc kubenswrapper[5047]: I0223 08:30:11.672184 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-v9vpl"] Feb 23 08:30:11 crc kubenswrapper[5047]: I0223 08:30:11.682869 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-v9vpl"] Feb 23 08:30:12 crc kubenswrapper[5047]: I0223 08:30:12.352507 5047 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" path="/var/lib/kubelet/pods/e91a6ea2-940e-421f-af86-c26daacb7a9f/volumes" Feb 23 08:30:12 crc kubenswrapper[5047]: I0223 08:30:12.353877 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7332cd9-b2b7-45b7-a73c-44c25dde73d7" path="/var/lib/kubelet/pods/f7332cd9-b2b7-45b7-a73c-44c25dde73d7/volumes" Feb 23 08:30:13 crc kubenswrapper[5047]: I0223 08:30:13.341617 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:30:13 crc kubenswrapper[5047]: E0223 08:30:13.342520 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.710266 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z9f6r"] Feb 23 08:30:16 crc kubenswrapper[5047]: E0223 08:30:16.710761 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerName="init" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.710783 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerName="init" Feb 23 08:30:16 crc kubenswrapper[5047]: E0223 08:30:16.710829 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7332cd9-b2b7-45b7-a73c-44c25dde73d7" containerName="mariadb-account-create-update" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.710843 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7332cd9-b2b7-45b7-a73c-44c25dde73d7" 
containerName="mariadb-account-create-update" Feb 23 08:30:16 crc kubenswrapper[5047]: E0223 08:30:16.710866 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerName="dnsmasq-dns" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.710880 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerName="dnsmasq-dns" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.711193 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91a6ea2-940e-421f-af86-c26daacb7a9f" containerName="dnsmasq-dns" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.711237 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7332cd9-b2b7-45b7-a73c-44c25dde73d7" containerName="mariadb-account-create-update" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.712172 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.721633 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.729847 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z9f6r"] Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.845965 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7zn\" (UniqueName: \"kubernetes.io/projected/3877d0e2-e421-45d2-ab02-608987a4ab03-kube-api-access-dw7zn\") pod \"root-account-create-update-z9f6r\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.846603 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3877d0e2-e421-45d2-ab02-608987a4ab03-operator-scripts\") pod \"root-account-create-update-z9f6r\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.948112 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7zn\" (UniqueName: \"kubernetes.io/projected/3877d0e2-e421-45d2-ab02-608987a4ab03-kube-api-access-dw7zn\") pod \"root-account-create-update-z9f6r\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.948229 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3877d0e2-e421-45d2-ab02-608987a4ab03-operator-scripts\") pod \"root-account-create-update-z9f6r\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.950228 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3877d0e2-e421-45d2-ab02-608987a4ab03-operator-scripts\") pod \"root-account-create-update-z9f6r\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:16 crc kubenswrapper[5047]: I0223 08:30:16.974585 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7zn\" (UniqueName: \"kubernetes.io/projected/3877d0e2-e421-45d2-ab02-608987a4ab03-kube-api-access-dw7zn\") pod \"root-account-create-update-z9f6r\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:17 crc kubenswrapper[5047]: I0223 08:30:17.046135 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:17 crc kubenswrapper[5047]: I0223 08:30:17.355716 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z9f6r"] Feb 23 08:30:17 crc kubenswrapper[5047]: W0223 08:30:17.356956 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3877d0e2_e421_45d2_ab02_608987a4ab03.slice/crio-b21e1689d315f1703cbbd7c4572466589ceb0b912e810104f62f1f1255c1a340 WatchSource:0}: Error finding container b21e1689d315f1703cbbd7c4572466589ceb0b912e810104f62f1f1255c1a340: Status 404 returned error can't find the container with id b21e1689d315f1703cbbd7c4572466589ceb0b912e810104f62f1f1255c1a340 Feb 23 08:30:17 crc kubenswrapper[5047]: I0223 08:30:17.688553 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z9f6r" event={"ID":"3877d0e2-e421-45d2-ab02-608987a4ab03","Type":"ContainerStarted","Data":"f93cfea23a9ba94ac4dc13889ee91f2d871df1cde600bd475bb6e57aa79f0b38"} Feb 23 08:30:17 crc kubenswrapper[5047]: I0223 08:30:17.689215 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z9f6r" event={"ID":"3877d0e2-e421-45d2-ab02-608987a4ab03","Type":"ContainerStarted","Data":"b21e1689d315f1703cbbd7c4572466589ceb0b912e810104f62f1f1255c1a340"} Feb 23 08:30:17 crc kubenswrapper[5047]: I0223 08:30:17.717109 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-z9f6r" podStartSLOduration=1.717088617 podStartE2EDuration="1.717088617s" podCreationTimestamp="2026-02-23 08:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:30:17.712196475 +0000 UTC m=+6339.963523649" watchObservedRunningTime="2026-02-23 08:30:17.717088617 +0000 UTC m=+6339.968415761" Feb 
23 08:30:18 crc kubenswrapper[5047]: I0223 08:30:18.698964 5047 generic.go:334] "Generic (PLEG): container finished" podID="3877d0e2-e421-45d2-ab02-608987a4ab03" containerID="f93cfea23a9ba94ac4dc13889ee91f2d871df1cde600bd475bb6e57aa79f0b38" exitCode=0 Feb 23 08:30:18 crc kubenswrapper[5047]: I0223 08:30:18.699033 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z9f6r" event={"ID":"3877d0e2-e421-45d2-ab02-608987a4ab03","Type":"ContainerDied","Data":"f93cfea23a9ba94ac4dc13889ee91f2d871df1cde600bd475bb6e57aa79f0b38"} Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.105679 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z9f6r" Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.126495 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3877d0e2-e421-45d2-ab02-608987a4ab03-operator-scripts\") pod \"3877d0e2-e421-45d2-ab02-608987a4ab03\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.126556 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw7zn\" (UniqueName: \"kubernetes.io/projected/3877d0e2-e421-45d2-ab02-608987a4ab03-kube-api-access-dw7zn\") pod \"3877d0e2-e421-45d2-ab02-608987a4ab03\" (UID: \"3877d0e2-e421-45d2-ab02-608987a4ab03\") " Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.129675 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3877d0e2-e421-45d2-ab02-608987a4ab03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3877d0e2-e421-45d2-ab02-608987a4ab03" (UID: "3877d0e2-e421-45d2-ab02-608987a4ab03"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.140391 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3877d0e2-e421-45d2-ab02-608987a4ab03-kube-api-access-dw7zn" (OuterVolumeSpecName: "kube-api-access-dw7zn") pod "3877d0e2-e421-45d2-ab02-608987a4ab03" (UID: "3877d0e2-e421-45d2-ab02-608987a4ab03"). InnerVolumeSpecName "kube-api-access-dw7zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.227711 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3877d0e2-e421-45d2-ab02-608987a4ab03-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.227751 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw7zn\" (UniqueName: \"kubernetes.io/projected/3877d0e2-e421-45d2-ab02-608987a4ab03-kube-api-access-dw7zn\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.717261 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z9f6r" event={"ID":"3877d0e2-e421-45d2-ab02-608987a4ab03","Type":"ContainerDied","Data":"b21e1689d315f1703cbbd7c4572466589ceb0b912e810104f62f1f1255c1a340"} Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.717313 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b21e1689d315f1703cbbd7c4572466589ceb0b912e810104f62f1f1255c1a340" Feb 23 08:30:20 crc kubenswrapper[5047]: I0223 08:30:20.717323 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z9f6r"
Feb 23 08:30:21 crc kubenswrapper[5047]: I0223 08:30:21.731156 5047 generic.go:334] "Generic (PLEG): container finished" podID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerID="6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739" exitCode=0
Feb 23 08:30:21 crc kubenswrapper[5047]: I0223 08:30:21.731552 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7c88bdf-c040-42b6-a5e4-269f32bd99d0","Type":"ContainerDied","Data":"6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739"}
Feb 23 08:30:22 crc kubenswrapper[5047]: I0223 08:30:22.742739 5047 generic.go:334] "Generic (PLEG): container finished" podID="474079eb-cdc7-47b4-b555-8909f6a658d5" containerID="8b24cf14a83f74292a635bcb65fe1121a5c60309a84222dbed4cc591f9970f6b" exitCode=0
Feb 23 08:30:22 crc kubenswrapper[5047]: I0223 08:30:22.742861 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"474079eb-cdc7-47b4-b555-8909f6a658d5","Type":"ContainerDied","Data":"8b24cf14a83f74292a635bcb65fe1121a5c60309a84222dbed4cc591f9970f6b"}
Feb 23 08:30:22 crc kubenswrapper[5047]: I0223 08:30:22.747321 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7c88bdf-c040-42b6-a5e4-269f32bd99d0","Type":"ContainerStarted","Data":"f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea"}
Feb 23 08:30:22 crc kubenswrapper[5047]: I0223 08:30:22.747758 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 23 08:30:22 crc kubenswrapper[5047]: I0223 08:30:22.826927 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.11599391 podStartE2EDuration="1m8.826888358s" podCreationTimestamp="2026-02-23 08:29:14 +0000 UTC" firstStartedPulling="2026-02-23 08:29:16.42078333 +0000 UTC m=+6278.672110464" lastFinishedPulling="2026-02-23 08:29:47.131677768 +0000 UTC m=+6309.383004912" observedRunningTime="2026-02-23 08:30:22.824631327 +0000 UTC m=+6345.075958471" watchObservedRunningTime="2026-02-23 08:30:22.826888358 +0000 UTC m=+6345.078215492"
Feb 23 08:30:23 crc kubenswrapper[5047]: I0223 08:30:23.758811 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"474079eb-cdc7-47b4-b555-8909f6a658d5","Type":"ContainerStarted","Data":"7e4183c5399f5062fa2e344886a9e145b220b9d4dae1adaa24dc2c2756a8a1d6"}
Feb 23 08:30:23 crc kubenswrapper[5047]: I0223 08:30:23.759495 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:30:23 crc kubenswrapper[5047]: I0223 08:30:23.805498 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.023657525 podStartE2EDuration="1m9.805475821s" podCreationTimestamp="2026-02-23 08:29:14 +0000 UTC" firstStartedPulling="2026-02-23 08:29:16.347146489 +0000 UTC m=+6278.598473623" lastFinishedPulling="2026-02-23 08:29:47.128964785 +0000 UTC m=+6309.380291919" observedRunningTime="2026-02-23 08:30:23.796972842 +0000 UTC m=+6346.048299996" watchObservedRunningTime="2026-02-23 08:30:23.805475821 +0000 UTC m=+6346.056802955"
Feb 23 08:30:26 crc kubenswrapper[5047]: I0223 08:30:26.679525 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0"
Feb 23 08:30:26 crc kubenswrapper[5047]: E0223 08:30:26.682025 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:30:35 crc kubenswrapper[5047]: I0223 08:30:35.823372 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 23 08:30:35 crc kubenswrapper[5047]: I0223 08:30:35.925184 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:30:39 crc kubenswrapper[5047]: I0223 08:30:39.342991 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0"
Feb 23 08:30:39 crc kubenswrapper[5047]: E0223 08:30:39.343848 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.696642 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-hph48"]
Feb 23 08:30:41 crc kubenswrapper[5047]: E0223 08:30:41.697437 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3877d0e2-e421-45d2-ab02-608987a4ab03" containerName="mariadb-account-create-update"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.697461 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3877d0e2-e421-45d2-ab02-608987a4ab03" containerName="mariadb-account-create-update"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.697658 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3877d0e2-e421-45d2-ab02-608987a4ab03" containerName="mariadb-account-create-update"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.698822 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.719829 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-hph48"]
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.827285 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v25c6\" (UniqueName: \"kubernetes.io/projected/79488449-9115-4c14-a738-6e8a8fc93dd6-kube-api-access-v25c6\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.827379 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-config\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.827409 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-dns-svc\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.928591 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-config\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.928879 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-dns-svc\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.929152 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v25c6\" (UniqueName: \"kubernetes.io/projected/79488449-9115-4c14-a738-6e8a8fc93dd6-kube-api-access-v25c6\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.929461 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-config\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.930304 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-dns-svc\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:41 crc kubenswrapper[5047]: I0223 08:30:41.964182 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v25c6\" (UniqueName: \"kubernetes.io/projected/79488449-9115-4c14-a738-6e8a8fc93dd6-kube-api-access-v25c6\") pod \"dnsmasq-dns-79496f79cc-hph48\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:42 crc kubenswrapper[5047]: I0223 08:30:42.068731 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:42 crc kubenswrapper[5047]: I0223 08:30:42.391502 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 08:30:42 crc kubenswrapper[5047]: I0223 08:30:42.591745 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-hph48"]
Feb 23 08:30:42 crc kubenswrapper[5047]: W0223 08:30:42.599091 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79488449_9115_4c14_a738_6e8a8fc93dd6.slice/crio-28593a9e2c4cb60ef09566c289bda89d6e48e1f84b67d703a7ba81c568371349 WatchSource:0}: Error finding container 28593a9e2c4cb60ef09566c289bda89d6e48e1f84b67d703a7ba81c568371349: Status 404 returned error can't find the container with id 28593a9e2c4cb60ef09566c289bda89d6e48e1f84b67d703a7ba81c568371349
Feb 23 08:30:42 crc kubenswrapper[5047]: I0223 08:30:42.946542 5047 generic.go:334] "Generic (PLEG): container finished" podID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerID="c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a" exitCode=0
Feb 23 08:30:42 crc kubenswrapper[5047]: I0223 08:30:42.946632 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-hph48" event={"ID":"79488449-9115-4c14-a738-6e8a8fc93dd6","Type":"ContainerDied","Data":"c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a"}
Feb 23 08:30:42 crc kubenswrapper[5047]: I0223 08:30:42.946676 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-hph48" event={"ID":"79488449-9115-4c14-a738-6e8a8fc93dd6","Type":"ContainerStarted","Data":"28593a9e2c4cb60ef09566c289bda89d6e48e1f84b67d703a7ba81c568371349"}
Feb 23 08:30:43 crc kubenswrapper[5047]: I0223 08:30:43.413348 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 08:30:43 crc kubenswrapper[5047]: I0223 08:30:43.956138 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-hph48" event={"ID":"79488449-9115-4c14-a738-6e8a8fc93dd6","Type":"ContainerStarted","Data":"a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc"}
Feb 23 08:30:43 crc kubenswrapper[5047]: I0223 08:30:43.956613 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:43 crc kubenswrapper[5047]: I0223 08:30:43.982358 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79496f79cc-hph48" podStartSLOduration=2.982336033 podStartE2EDuration="2.982336033s" podCreationTimestamp="2026-02-23 08:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:30:43.979158317 +0000 UTC m=+6366.230485461" watchObservedRunningTime="2026-02-23 08:30:43.982336033 +0000 UTC m=+6366.233663177"
Feb 23 08:30:47 crc kubenswrapper[5047]: I0223 08:30:47.350088 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerName="rabbitmq" containerID="cri-o://f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea" gracePeriod=604796
Feb 23 08:30:47 crc kubenswrapper[5047]: I0223 08:30:47.590033 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="474079eb-cdc7-47b4-b555-8909f6a658d5" containerName="rabbitmq" containerID="cri-o://7e4183c5399f5062fa2e344886a9e145b220b9d4dae1adaa24dc2c2756a8a1d6" gracePeriod=604796
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.071240 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79496f79cc-hph48"
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.149682 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-zg89r"]
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.150181 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerName="dnsmasq-dns" containerID="cri-o://3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af" gracePeriod=10
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.643384 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r"
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.725078 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-dns-svc\") pod \"2e02ce02-2283-4c5d-8094-e8826849eea3\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") "
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.725158 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b825\" (UniqueName: \"kubernetes.io/projected/2e02ce02-2283-4c5d-8094-e8826849eea3-kube-api-access-6b825\") pod \"2e02ce02-2283-4c5d-8094-e8826849eea3\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") "
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.725264 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-config\") pod \"2e02ce02-2283-4c5d-8094-e8826849eea3\" (UID: \"2e02ce02-2283-4c5d-8094-e8826849eea3\") "
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.748237 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e02ce02-2283-4c5d-8094-e8826849eea3-kube-api-access-6b825" (OuterVolumeSpecName: "kube-api-access-6b825") pod "2e02ce02-2283-4c5d-8094-e8826849eea3" (UID: "2e02ce02-2283-4c5d-8094-e8826849eea3"). InnerVolumeSpecName "kube-api-access-6b825". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.764372 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-config" (OuterVolumeSpecName: "config") pod "2e02ce02-2283-4c5d-8094-e8826849eea3" (UID: "2e02ce02-2283-4c5d-8094-e8826849eea3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.773130 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e02ce02-2283-4c5d-8094-e8826849eea3" (UID: "2e02ce02-2283-4c5d-8094-e8826849eea3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.827080 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.827623 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e02ce02-2283-4c5d-8094-e8826849eea3-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 08:30:52 crc kubenswrapper[5047]: I0223 08:30:52.827647 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b825\" (UniqueName: \"kubernetes.io/projected/2e02ce02-2283-4c5d-8094-e8826849eea3-kube-api-access-6b825\") on node \"crc\" DevicePath \"\""
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.050977 5047 generic.go:334] "Generic (PLEG): container finished" podID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerID="3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af" exitCode=0
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.051067 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" event={"ID":"2e02ce02-2283-4c5d-8094-e8826849eea3","Type":"ContainerDied","Data":"3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af"}
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.051162 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r" event={"ID":"2e02ce02-2283-4c5d-8094-e8826849eea3","Type":"ContainerDied","Data":"51a0c1fa3323c8c0ab637b3cee9760c5e8ce872c8bbbb000acad46b4fe3032a2"}
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.051194 5047 scope.go:117] "RemoveContainer" containerID="3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af"
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.051092 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-zg89r"
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.090995 5047 scope.go:117] "RemoveContainer" containerID="3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd"
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.091961 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-zg89r"]
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.098339 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-zg89r"]
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.120570 5047 scope.go:117] "RemoveContainer" containerID="3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af"
Feb 23 08:30:53 crc kubenswrapper[5047]: E0223 08:30:53.121174 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af\": container with ID starting with 3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af not found: ID does not exist" containerID="3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af"
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.121225 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af"} err="failed to get container status \"3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af\": rpc error: code = NotFound desc = could not find container \"3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af\": container with ID starting with 3036046b0bfaaad24871bee9bf4881c3554f6ccd9dc10bb3d4ce234f61b980af not found: ID does not exist"
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.121272 5047 scope.go:117] "RemoveContainer" containerID="3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd"
Feb 23 08:30:53 crc kubenswrapper[5047]: E0223 08:30:53.122061 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd\": container with ID starting with 3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd not found: ID does not exist" containerID="3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd"
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.122093 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd"} err="failed to get container status \"3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd\": rpc error: code = NotFound desc = could not find container \"3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd\": container with ID starting with 3ecbf5cb8ff4ff3fe9d4c5513a8359a088708f0d1da16f64259390678abbdadd not found: ID does not exist"
Feb 23 08:30:53 crc kubenswrapper[5047]: I0223 08:30:53.340999 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.060187 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.063125 5047 generic.go:334] "Generic (PLEG): container finished" podID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerID="f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea" exitCode=0
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.063217 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7c88bdf-c040-42b6-a5e4-269f32bd99d0","Type":"ContainerDied","Data":"f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea"}
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.063255 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7c88bdf-c040-42b6-a5e4-269f32bd99d0","Type":"ContainerDied","Data":"769d0093cc86d81251a1d5e6420ef2cad2f8a7fad4949b36b037dd2216a38ab8"}
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.063278 5047 scope.go:117] "RemoveContainer" containerID="f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.065296 5047 generic.go:334] "Generic (PLEG): container finished" podID="474079eb-cdc7-47b4-b555-8909f6a658d5" containerID="7e4183c5399f5062fa2e344886a9e145b220b9d4dae1adaa24dc2c2756a8a1d6" exitCode=0
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.065373 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"474079eb-cdc7-47b4-b555-8909f6a658d5","Type":"ContainerDied","Data":"7e4183c5399f5062fa2e344886a9e145b220b9d4dae1adaa24dc2c2756a8a1d6"}
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.081500 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"dc00fe3df405381f420e2b1840543fc3d733eab4a154b2a6713423066de3f8b0"}
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.142979 5047 scope.go:117] "RemoveContainer" containerID="6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159204 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-config-data\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159271 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-plugins\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159333 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-erlang-cookie-secret\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159485 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159564 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-erlang-cookie\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159615 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-plugins-conf\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159663 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt4sd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-kube-api-access-nt4sd\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159697 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-pod-info\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159739 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-tls\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159763 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-server-conf\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.159798 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-confd\") pod \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\" (UID: \"e7c88bdf-c040-42b6-a5e4-269f32bd99d0\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.161772 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.165492 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.166059 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.173709 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.173746 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-pod-info" (OuterVolumeSpecName: "pod-info") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.175074 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.179383 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.182343 5047 scope.go:117] "RemoveContainer" containerID="f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea"
Feb 23 08:30:54 crc kubenswrapper[5047]: E0223 08:30:54.183200 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea\": container with ID starting with f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea not found: ID does not exist" containerID="f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.183376 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea"} err="failed to get container status \"f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea\": rpc error: code = NotFound desc = could not find container \"f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea\": container with ID starting with f52850da8a6346cae35d765716c69f78d845a376d2b3a5867827f7d27c7665ea not found: ID does not exist"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.183528 5047 scope.go:117] "RemoveContainer" containerID="6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739"
Feb 23 08:30:54 crc kubenswrapper[5047]: E0223 08:30:54.184259 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739\": container with ID starting with 6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739 not found: ID does not exist" containerID="6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.184359 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739"} err="failed to get container status \"6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739\": rpc error: code = NotFound desc = could not find container \"6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739\": container with ID starting with 6e77d5b862d856ce26486bddd0e0fc9c45e120bdeb53f66f6f386516d9fde739 not found: ID does not exist"
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.184691 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-config-data" (OuterVolumeSpecName: "config-data") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.186445 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c" (OuterVolumeSpecName: "persistence") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "pvc-82d2ef49-2524-428f-af58-59241651700c". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.194661 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-kube-api-access-nt4sd" (OuterVolumeSpecName: "kube-api-access-nt4sd") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "kube-api-access-nt4sd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.239065 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-server-conf" (OuterVolumeSpecName: "server-conf") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261648 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-plugins-conf\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261730 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-confd\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261770 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l527f\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-kube-api-access-l527f\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261800 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474079eb-cdc7-47b4-b555-8909f6a658d5-erlang-cookie-secret\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261842 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/474079eb-cdc7-47b4-b555-8909f6a658d5-pod-info\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261866 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-plugins\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261886 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-server-conf\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261972 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-config-data\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.261994 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-erlang-cookie\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262102 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262127 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-tls\") pod \"474079eb-cdc7-47b4-b555-8909f6a658d5\" (UID: \"474079eb-cdc7-47b4-b555-8909f6a658d5\") "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262388 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") on node \"crc\" "
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262404 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262414 5047 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262424 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt4sd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-kube-api-access-nt4sd\") on node \"crc\" DevicePath \"\""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262434 5047 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-pod-info\") on node \"crc\" DevicePath \"\""
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262445 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\"
(UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262454 5047 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262462 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262470 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.262479 5047 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.263598 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.266668 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). 
InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.267063 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-kube-api-access-l527f" (OuterVolumeSpecName: "kube-api-access-l527f") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "kube-api-access-l527f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.267266 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.270631 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474079eb-cdc7-47b4-b555-8909f6a658d5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.272231 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.277718 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/474079eb-cdc7-47b4-b555-8909f6a658d5-pod-info" (OuterVolumeSpecName: "pod-info") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.279877 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4" (OuterVolumeSpecName: "persistence") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "pvc-381a0458-ccaa-459e-8751-bf247399cbd4". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.298190 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e7c88bdf-c040-42b6-a5e4-269f32bd99d0" (UID: "e7c88bdf-c040-42b6-a5e4-269f32bd99d0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.306826 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.307010 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-82d2ef49-2524-428f-af58-59241651700c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c") on node "crc" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.311688 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-config-data" (OuterVolumeSpecName: "config-data") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.328458 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-server-conf" (OuterVolumeSpecName: "server-conf") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.368087 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" path="/var/lib/kubelet/pods/2e02ce02-2283-4c5d-8094-e8826849eea3/volumes" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369108 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") on node \"crc\" " Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369157 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369174 5047 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369193 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7c88bdf-c040-42b6-a5e4-269f32bd99d0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369206 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l527f\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-kube-api-access-l527f\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369219 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") on node \"crc\" 
DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369234 5047 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474079eb-cdc7-47b4-b555-8909f6a658d5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369246 5047 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/474079eb-cdc7-47b4-b555-8909f6a658d5-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369260 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369272 5047 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369283 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474079eb-cdc7-47b4-b555-8909f6a658d5-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.369295 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.387223 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.387575 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-381a0458-ccaa-459e-8751-bf247399cbd4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4") on node "crc" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.470269 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.627823 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "474079eb-cdc7-47b4-b555-8909f6a658d5" (UID: "474079eb-cdc7-47b4-b555-8909f6a658d5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:54 crc kubenswrapper[5047]: I0223 08:30:54.700505 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474079eb-cdc7-47b4-b555-8909f6a658d5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.119563 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.123256 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"474079eb-cdc7-47b4-b555-8909f6a658d5","Type":"ContainerDied","Data":"76e660be09ff3d762b1380811c3bb838e87214d64f27ff0227a10b3ddef21f3f"} Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.123324 5047 scope.go:117] "RemoveContainer" containerID="7e4183c5399f5062fa2e344886a9e145b220b9d4dae1adaa24dc2c2756a8a1d6" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.127705 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.166610 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.184015 5047 scope.go:117] "RemoveContainer" containerID="8b24cf14a83f74292a635bcb65fe1121a5c60309a84222dbed4cc591f9970f6b" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.192257 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.205883 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.225167 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233289 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: E0223 08:30:55.233634 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerName="rabbitmq" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233652 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerName="rabbitmq" Feb 23 08:30:55 crc kubenswrapper[5047]: E0223 08:30:55.233665 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerName="init" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233671 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerName="init" Feb 23 08:30:55 crc kubenswrapper[5047]: E0223 08:30:55.233687 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerName="setup-container" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233693 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerName="setup-container" Feb 23 08:30:55 crc kubenswrapper[5047]: E0223 08:30:55.233713 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474079eb-cdc7-47b4-b555-8909f6a658d5" containerName="setup-container" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233719 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="474079eb-cdc7-47b4-b555-8909f6a658d5" containerName="setup-container" Feb 23 08:30:55 crc kubenswrapper[5047]: E0223 08:30:55.233733 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerName="dnsmasq-dns" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233739 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerName="dnsmasq-dns" Feb 23 08:30:55 crc kubenswrapper[5047]: E0223 08:30:55.233749 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474079eb-cdc7-47b4-b555-8909f6a658d5" containerName="rabbitmq" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233757 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="474079eb-cdc7-47b4-b555-8909f6a658d5" 
containerName="rabbitmq" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233935 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" containerName="rabbitmq" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233951 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="474079eb-cdc7-47b4-b555-8909f6a658d5" containerName="rabbitmq" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.233962 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e02ce02-2283-4c5d-8094-e8826849eea3" containerName="dnsmasq-dns" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.234768 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.239865 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.240087 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.240141 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.240090 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rsnxc" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.240364 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.242475 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.243657 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.253705 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.254161 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.254322 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.255364 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.258356 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.258514 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.258716 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.258826 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.259290 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.259828 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-r8k72" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.265661 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.429571 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.429690 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8f82aad-7df9-4b14-a328-2cc708aeed84-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.429747 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.429864 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.429952 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430010 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld4sk\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-kube-api-access-ld4sk\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430047 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430093 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430131 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430171 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430206 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db558a41-6dbf-4b18-af50-6a5311530ef4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430403 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430487 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430577 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.430792 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 
08:30:55.430887 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjm2n\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-kube-api-access-fjm2n\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.431005 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8f82aad-7df9-4b14-a328-2cc708aeed84-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.431127 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.431174 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.431312 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db558a41-6dbf-4b18-af50-6a5311530ef4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.431356 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.431394 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.532803 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8f82aad-7df9-4b14-a328-2cc708aeed84-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.532882 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.532913 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.532940 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/db558a41-6dbf-4b18-af50-6a5311530ef4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.532959 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.532975 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533004 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533025 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8f82aad-7df9-4b14-a328-2cc708aeed84-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533049 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533076 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533106 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533131 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld4sk\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-kube-api-access-ld4sk\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533151 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533175 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" 
Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533193 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533212 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533231 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db558a41-6dbf-4b18-af50-6a5311530ef4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533250 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533266 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533289 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533314 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.533331 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjm2n\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-kube-api-access-fjm2n\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.534201 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.534523 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.534882 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.535201 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.535236 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.535827 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.536134 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.537777 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.538749 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.542514 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.553114 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.553738 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.553866 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8f82aad-7df9-4b14-a328-2cc708aeed84-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.555517 5047 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.555552 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aabf718db6f5931eb11327a98efab57b5892cb45001a61dea3c49cc7bf70a70f/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.555946 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.556710 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.557165 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8f82aad-7df9-4b14-a328-2cc708aeed84-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.559542 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db558a41-6dbf-4b18-af50-6a5311530ef4-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.564746 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.564769 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/577037528b2482e0b9f6f00316925bdadde6388bc3db266287efd4a6d1ff6d9e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.569284 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db558a41-6dbf-4b18-af50-6a5311530ef4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.582264 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjm2n\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-kube-api-access-fjm2n\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.590206 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld4sk\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-kube-api-access-ld4sk\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.595169 5047 scope.go:117] "RemoveContainer" containerID="f56bb4bcb58240badcd123a44f7d98f5bf8f8bd27c80dbc9f57a3ddbcac1fc28" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.784070 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"rabbitmq-server-0\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.789319 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.863332 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:30:55 crc kubenswrapper[5047]: I0223 08:30:55.908270 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:30:56 crc kubenswrapper[5047]: I0223 08:30:56.230193 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:30:56 crc kubenswrapper[5047]: W0223 08:30:56.246077 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f82aad_7df9_4b14_a328_2cc708aeed84.slice/crio-7c68d16a654220b5726f957209a20f588b0ff5fdaa881dc113e2ec4864b7f892 WatchSource:0}: Error finding container 7c68d16a654220b5726f957209a20f588b0ff5fdaa881dc113e2ec4864b7f892: Status 404 returned error can't find the container with id 7c68d16a654220b5726f957209a20f588b0ff5fdaa881dc113e2ec4864b7f892 Feb 23 08:30:56 crc kubenswrapper[5047]: I0223 08:30:56.306114 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:30:56 crc kubenswrapper[5047]: I0223 08:30:56.379718 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474079eb-cdc7-47b4-b555-8909f6a658d5" path="/var/lib/kubelet/pods/474079eb-cdc7-47b4-b555-8909f6a658d5/volumes" Feb 23 08:30:56 crc kubenswrapper[5047]: I0223 08:30:56.381786 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c88bdf-c040-42b6-a5e4-269f32bd99d0" path="/var/lib/kubelet/pods/e7c88bdf-c040-42b6-a5e4-269f32bd99d0/volumes" Feb 23 08:30:57 crc kubenswrapper[5047]: I0223 08:30:57.157290 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"db558a41-6dbf-4b18-af50-6a5311530ef4","Type":"ContainerStarted","Data":"6d9d8ea0af57fce16d9c37676ecf917da449db33ee3b80528f6a720b64a675bc"} Feb 23 08:30:57 crc kubenswrapper[5047]: I0223 08:30:57.159599 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"d8f82aad-7df9-4b14-a328-2cc708aeed84","Type":"ContainerStarted","Data":"7c68d16a654220b5726f957209a20f588b0ff5fdaa881dc113e2ec4864b7f892"} Feb 23 08:30:58 crc kubenswrapper[5047]: I0223 08:30:58.172947 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"db558a41-6dbf-4b18-af50-6a5311530ef4","Type":"ContainerStarted","Data":"764eca4b1eae80106e477be230c3c004a88237b926e233ec66dd335cc48d62a3"} Feb 23 08:30:59 crc kubenswrapper[5047]: I0223 08:30:59.185310 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d8f82aad-7df9-4b14-a328-2cc708aeed84","Type":"ContainerStarted","Data":"5b1e9abcc11ca95b25f760ee4921480a4a3de7b01708fdf6b170d338c0245738"} Feb 23 08:31:07 crc kubenswrapper[5047]: I0223 08:31:07.866262 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xr84r"] Feb 23 08:31:07 crc kubenswrapper[5047]: I0223 08:31:07.894951 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xr84r"] Feb 23 08:31:07 crc kubenswrapper[5047]: I0223 08:31:07.895142 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:07 crc kubenswrapper[5047]: I0223 08:31:07.962618 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j6mn\" (UniqueName: \"kubernetes.io/projected/823c75d9-7382-4bae-9971-94ff993c8a2b-kube-api-access-9j6mn\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:07 crc kubenswrapper[5047]: I0223 08:31:07.962712 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-catalog-content\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:07 crc kubenswrapper[5047]: I0223 08:31:07.962835 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-utilities\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.064867 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-catalog-content\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.064958 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-utilities\") pod \"redhat-operators-xr84r\" (UID: 
\"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.065055 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j6mn\" (UniqueName: \"kubernetes.io/projected/823c75d9-7382-4bae-9971-94ff993c8a2b-kube-api-access-9j6mn\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.065569 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-catalog-content\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.065589 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-utilities\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.110583 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j6mn\" (UniqueName: \"kubernetes.io/projected/823c75d9-7382-4bae-9971-94ff993c8a2b-kube-api-access-9j6mn\") pod \"redhat-operators-xr84r\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.226052 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:08 crc kubenswrapper[5047]: I0223 08:31:08.744141 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xr84r"] Feb 23 08:31:09 crc kubenswrapper[5047]: I0223 08:31:09.296316 5047 generic.go:334] "Generic (PLEG): container finished" podID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerID="9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56" exitCode=0 Feb 23 08:31:09 crc kubenswrapper[5047]: I0223 08:31:09.296488 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr84r" event={"ID":"823c75d9-7382-4bae-9971-94ff993c8a2b","Type":"ContainerDied","Data":"9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56"} Feb 23 08:31:09 crc kubenswrapper[5047]: I0223 08:31:09.296613 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr84r" event={"ID":"823c75d9-7382-4bae-9971-94ff993c8a2b","Type":"ContainerStarted","Data":"fda29b9be4ed2cc55ebcf33006f331fe55bf9a8a9a1b5f6cc821b58b89402e53"} Feb 23 08:31:09 crc kubenswrapper[5047]: I0223 08:31:09.299364 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:31:11 crc kubenswrapper[5047]: I0223 08:31:11.317179 5047 generic.go:334] "Generic (PLEG): container finished" podID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerID="2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4" exitCode=0 Feb 23 08:31:11 crc kubenswrapper[5047]: I0223 08:31:11.317275 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr84r" event={"ID":"823c75d9-7382-4bae-9971-94ff993c8a2b","Type":"ContainerDied","Data":"2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4"} Feb 23 08:31:12 crc kubenswrapper[5047]: I0223 08:31:12.328395 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xr84r" event={"ID":"823c75d9-7382-4bae-9971-94ff993c8a2b","Type":"ContainerStarted","Data":"3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1"} Feb 23 08:31:12 crc kubenswrapper[5047]: I0223 08:31:12.355800 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xr84r" podStartSLOduration=2.87372505 podStartE2EDuration="5.355775s" podCreationTimestamp="2026-02-23 08:31:07 +0000 UTC" firstStartedPulling="2026-02-23 08:31:09.299020035 +0000 UTC m=+6391.550347189" lastFinishedPulling="2026-02-23 08:31:11.781069975 +0000 UTC m=+6394.032397139" observedRunningTime="2026-02-23 08:31:12.352275486 +0000 UTC m=+6394.603602640" watchObservedRunningTime="2026-02-23 08:31:12.355775 +0000 UTC m=+6394.607102144" Feb 23 08:31:18 crc kubenswrapper[5047]: I0223 08:31:18.227349 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:18 crc kubenswrapper[5047]: I0223 08:31:18.229036 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:19 crc kubenswrapper[5047]: I0223 08:31:19.292356 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xr84r" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="registry-server" probeResult="failure" output=< Feb 23 08:31:19 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:31:19 crc kubenswrapper[5047]: > Feb 23 08:31:28 crc kubenswrapper[5047]: I0223 08:31:28.325773 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:28 crc kubenswrapper[5047]: I0223 08:31:28.407333 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 
08:31:28 crc kubenswrapper[5047]: I0223 08:31:28.572739 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xr84r"] Feb 23 08:31:29 crc kubenswrapper[5047]: I0223 08:31:29.492403 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xr84r" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="registry-server" containerID="cri-o://3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1" gracePeriod=2 Feb 23 08:31:29 crc kubenswrapper[5047]: I0223 08:31:29.986587 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.164953 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-utilities\") pod \"823c75d9-7382-4bae-9971-94ff993c8a2b\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.165070 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-catalog-content\") pod \"823c75d9-7382-4bae-9971-94ff993c8a2b\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.165113 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j6mn\" (UniqueName: \"kubernetes.io/projected/823c75d9-7382-4bae-9971-94ff993c8a2b-kube-api-access-9j6mn\") pod \"823c75d9-7382-4bae-9971-94ff993c8a2b\" (UID: \"823c75d9-7382-4bae-9971-94ff993c8a2b\") " Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.166544 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-utilities" (OuterVolumeSpecName: "utilities") pod "823c75d9-7382-4bae-9971-94ff993c8a2b" (UID: "823c75d9-7382-4bae-9971-94ff993c8a2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.173045 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823c75d9-7382-4bae-9971-94ff993c8a2b-kube-api-access-9j6mn" (OuterVolumeSpecName: "kube-api-access-9j6mn") pod "823c75d9-7382-4bae-9971-94ff993c8a2b" (UID: "823c75d9-7382-4bae-9971-94ff993c8a2b"). InnerVolumeSpecName "kube-api-access-9j6mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.268750 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.268801 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j6mn\" (UniqueName: \"kubernetes.io/projected/823c75d9-7382-4bae-9971-94ff993c8a2b-kube-api-access-9j6mn\") on node \"crc\" DevicePath \"\"" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.302970 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "823c75d9-7382-4bae-9971-94ff993c8a2b" (UID: "823c75d9-7382-4bae-9971-94ff993c8a2b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.370143 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823c75d9-7382-4bae-9971-94ff993c8a2b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.505134 5047 generic.go:334] "Generic (PLEG): container finished" podID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerID="3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1" exitCode=0 Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.505205 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr84r" event={"ID":"823c75d9-7382-4bae-9971-94ff993c8a2b","Type":"ContainerDied","Data":"3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1"} Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.505316 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr84r" event={"ID":"823c75d9-7382-4bae-9971-94ff993c8a2b","Type":"ContainerDied","Data":"fda29b9be4ed2cc55ebcf33006f331fe55bf9a8a9a1b5f6cc821b58b89402e53"} Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.505351 5047 scope.go:117] "RemoveContainer" containerID="3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.506534 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xr84r" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.527068 5047 scope.go:117] "RemoveContainer" containerID="2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.544023 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xr84r"] Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.557377 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xr84r"] Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.568945 5047 scope.go:117] "RemoveContainer" containerID="9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.599130 5047 scope.go:117] "RemoveContainer" containerID="3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1" Feb 23 08:31:30 crc kubenswrapper[5047]: E0223 08:31:30.600124 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1\": container with ID starting with 3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1 not found: ID does not exist" containerID="3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.600210 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1"} err="failed to get container status \"3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1\": rpc error: code = NotFound desc = could not find container \"3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1\": container with ID starting with 3621ef43d9990cb4f952b67f179824a61db8a1fc657f1b8518385262fc4e7aa1 not found: ID does 
not exist" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.600265 5047 scope.go:117] "RemoveContainer" containerID="2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4" Feb 23 08:31:30 crc kubenswrapper[5047]: E0223 08:31:30.601610 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4\": container with ID starting with 2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4 not found: ID does not exist" containerID="2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.601659 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4"} err="failed to get container status \"2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4\": rpc error: code = NotFound desc = could not find container \"2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4\": container with ID starting with 2bb89e3658ebeb97f9bdf3d3c10cc3998d7b8e3763b1b44c9d4753e08b5f6de4 not found: ID does not exist" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.601691 5047 scope.go:117] "RemoveContainer" containerID="9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56" Feb 23 08:31:30 crc kubenswrapper[5047]: E0223 08:31:30.602246 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56\": container with ID starting with 9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56 not found: ID does not exist" containerID="9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56" Feb 23 08:31:30 crc kubenswrapper[5047]: I0223 08:31:30.602321 5047 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56"} err="failed to get container status \"9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56\": rpc error: code = NotFound desc = could not find container \"9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56\": container with ID starting with 9d3e723b619b3154ec81c92e7c9f37465be864a548cade16a62069ab8d896c56 not found: ID does not exist" Feb 23 08:31:31 crc kubenswrapper[5047]: I0223 08:31:31.519815 5047 generic.go:334] "Generic (PLEG): container finished" podID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerID="764eca4b1eae80106e477be230c3c004a88237b926e233ec66dd335cc48d62a3" exitCode=0 Feb 23 08:31:31 crc kubenswrapper[5047]: I0223 08:31:31.519884 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"db558a41-6dbf-4b18-af50-6a5311530ef4","Type":"ContainerDied","Data":"764eca4b1eae80106e477be230c3c004a88237b926e233ec66dd335cc48d62a3"} Feb 23 08:31:32 crc kubenswrapper[5047]: I0223 08:31:32.358119 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" path="/var/lib/kubelet/pods/823c75d9-7382-4bae-9971-94ff993c8a2b/volumes" Feb 23 08:31:32 crc kubenswrapper[5047]: I0223 08:31:32.559178 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"db558a41-6dbf-4b18-af50-6a5311530ef4","Type":"ContainerStarted","Data":"c118d283468399c97557ee04375c5d709e9aa2386c25e5bb23ba33476c1b630a"} Feb 23 08:31:32 crc kubenswrapper[5047]: I0223 08:31:32.561536 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:31:32 crc kubenswrapper[5047]: I0223 08:31:32.564773 5047 generic.go:334] "Generic (PLEG): container finished" podID="d8f82aad-7df9-4b14-a328-2cc708aeed84" 
containerID="5b1e9abcc11ca95b25f760ee4921480a4a3de7b01708fdf6b170d338c0245738" exitCode=0 Feb 23 08:31:32 crc kubenswrapper[5047]: I0223 08:31:32.564827 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d8f82aad-7df9-4b14-a328-2cc708aeed84","Type":"ContainerDied","Data":"5b1e9abcc11ca95b25f760ee4921480a4a3de7b01708fdf6b170d338c0245738"} Feb 23 08:31:32 crc kubenswrapper[5047]: I0223 08:31:32.608244 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.608225787 podStartE2EDuration="37.608225787s" podCreationTimestamp="2026-02-23 08:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:31:32.607363974 +0000 UTC m=+6414.858691138" watchObservedRunningTime="2026-02-23 08:31:32.608225787 +0000 UTC m=+6414.859552931" Feb 23 08:31:33 crc kubenswrapper[5047]: I0223 08:31:33.581681 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d8f82aad-7df9-4b14-a328-2cc708aeed84","Type":"ContainerStarted","Data":"1b91d13309e4eceb1e8a3d37f030db4d6384384b7d04e3f74e24403de0da2f57"} Feb 23 08:31:33 crc kubenswrapper[5047]: I0223 08:31:33.582322 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 23 08:31:33 crc kubenswrapper[5047]: I0223 08:31:33.611074 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.611049062 podStartE2EDuration="38.611049062s" podCreationTimestamp="2026-02-23 08:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:31:33.606080259 +0000 UTC m=+6415.857407423" watchObservedRunningTime="2026-02-23 08:31:33.611049062 +0000 UTC m=+6415.862376196" Feb 23 
08:31:45 crc kubenswrapper[5047]: I0223 08:31:45.869150 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 23 08:31:45 crc kubenswrapper[5047]: I0223 08:31:45.913228 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.133486 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 23 08:31:49 crc kubenswrapper[5047]: E0223 08:31:49.134468 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="extract-utilities" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.134486 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="extract-utilities" Feb 23 08:31:49 crc kubenswrapper[5047]: E0223 08:31:49.134506 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="registry-server" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.134512 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="registry-server" Feb 23 08:31:49 crc kubenswrapper[5047]: E0223 08:31:49.134551 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="extract-content" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.134558 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="extract-content" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.134739 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="823c75d9-7382-4bae-9971-94ff993c8a2b" containerName="registry-server" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.135470 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.139114 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tnc55" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.153434 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.226173 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqc5\" (UniqueName: \"kubernetes.io/projected/e07e3ae2-20c0-48e9-9daf-58476130de7c-kube-api-access-9pqc5\") pod \"mariadb-client\" (UID: \"e07e3ae2-20c0-48e9-9daf-58476130de7c\") " pod="openstack/mariadb-client" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.327535 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqc5\" (UniqueName: \"kubernetes.io/projected/e07e3ae2-20c0-48e9-9daf-58476130de7c-kube-api-access-9pqc5\") pod \"mariadb-client\" (UID: \"e07e3ae2-20c0-48e9-9daf-58476130de7c\") " pod="openstack/mariadb-client" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.348454 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqc5\" (UniqueName: \"kubernetes.io/projected/e07e3ae2-20c0-48e9-9daf-58476130de7c-kube-api-access-9pqc5\") pod \"mariadb-client\" (UID: \"e07e3ae2-20c0-48e9-9daf-58476130de7c\") " pod="openstack/mariadb-client" Feb 23 08:31:49 crc kubenswrapper[5047]: I0223 08:31:49.460430 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:31:50 crc kubenswrapper[5047]: I0223 08:31:50.065814 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:31:50 crc kubenswrapper[5047]: I0223 08:31:50.734358 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e07e3ae2-20c0-48e9-9daf-58476130de7c","Type":"ContainerStarted","Data":"364008fdba9b983dd9347a3cc1d68c203a060e257a6e5b82e260c13e894e69dd"} Feb 23 08:31:51 crc kubenswrapper[5047]: I0223 08:31:51.745100 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e07e3ae2-20c0-48e9-9daf-58476130de7c","Type":"ContainerStarted","Data":"36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b"} Feb 23 08:31:51 crc kubenswrapper[5047]: I0223 08:31:51.771746 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.237244929 podStartE2EDuration="2.771715831s" podCreationTimestamp="2026-02-23 08:31:49 +0000 UTC" firstStartedPulling="2026-02-23 08:31:50.075368914 +0000 UTC m=+6432.326696038" lastFinishedPulling="2026-02-23 08:31:50.609839806 +0000 UTC m=+6432.861166940" observedRunningTime="2026-02-23 08:31:51.763765757 +0000 UTC m=+6434.015092901" watchObservedRunningTime="2026-02-23 08:31:51.771715831 +0000 UTC m=+6434.023042965" Feb 23 08:32:06 crc kubenswrapper[5047]: I0223 08:32:06.863152 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:32:06 crc kubenswrapper[5047]: I0223 08:32:06.864820 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="e07e3ae2-20c0-48e9-9daf-58476130de7c" containerName="mariadb-client" containerID="cri-o://36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b" gracePeriod=30 Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.408031 5047 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.453578 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqc5\" (UniqueName: \"kubernetes.io/projected/e07e3ae2-20c0-48e9-9daf-58476130de7c-kube-api-access-9pqc5\") pod \"e07e3ae2-20c0-48e9-9daf-58476130de7c\" (UID: \"e07e3ae2-20c0-48e9-9daf-58476130de7c\") " Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.470342 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07e3ae2-20c0-48e9-9daf-58476130de7c-kube-api-access-9pqc5" (OuterVolumeSpecName: "kube-api-access-9pqc5") pod "e07e3ae2-20c0-48e9-9daf-58476130de7c" (UID: "e07e3ae2-20c0-48e9-9daf-58476130de7c"). InnerVolumeSpecName "kube-api-access-9pqc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.555517 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqc5\" (UniqueName: \"kubernetes.io/projected/e07e3ae2-20c0-48e9-9daf-58476130de7c-kube-api-access-9pqc5\") on node \"crc\" DevicePath \"\"" Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.877881 5047 generic.go:334] "Generic (PLEG): container finished" podID="e07e3ae2-20c0-48e9-9daf-58476130de7c" containerID="36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b" exitCode=143 Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.877962 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e07e3ae2-20c0-48e9-9daf-58476130de7c","Type":"ContainerDied","Data":"36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b"} Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.878050 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"e07e3ae2-20c0-48e9-9daf-58476130de7c","Type":"ContainerDied","Data":"364008fdba9b983dd9347a3cc1d68c203a060e257a6e5b82e260c13e894e69dd"} Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.878085 5047 scope.go:117] "RemoveContainer" containerID="36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b" Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.879862 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.905155 5047 scope.go:117] "RemoveContainer" containerID="36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b" Feb 23 08:32:07 crc kubenswrapper[5047]: E0223 08:32:07.905688 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b\": container with ID starting with 36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b not found: ID does not exist" containerID="36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b" Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.905754 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b"} err="failed to get container status \"36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b\": rpc error: code = NotFound desc = could not find container \"36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b\": container with ID starting with 36768953c78dcd61260bd0c066f29177928a55c1eda8ed7d72c6c310e1892b1b not found: ID does not exist" Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.931642 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:32:07 crc kubenswrapper[5047]: I0223 08:32:07.941363 5047 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/mariadb-client"] Feb 23 08:32:08 crc kubenswrapper[5047]: I0223 08:32:08.356579 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07e3ae2-20c0-48e9-9daf-58476130de7c" path="/var/lib/kubelet/pods/e07e3ae2-20c0-48e9-9daf-58476130de7c/volumes" Feb 23 08:32:55 crc kubenswrapper[5047]: I0223 08:32:55.764277 5047 scope.go:117] "RemoveContainer" containerID="e0028e7efb3f0193fba1eeb4c3255cc8e255cae96ce87cdb24d76ae0b574b86b" Feb 23 08:33:16 crc kubenswrapper[5047]: I0223 08:33:16.759869 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:33:16 crc kubenswrapper[5047]: I0223 08:33:16.760923 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:33:46 crc kubenswrapper[5047]: I0223 08:33:46.759667 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:33:46 crc kubenswrapper[5047]: I0223 08:33:46.761817 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:34:16 crc 
kubenswrapper[5047]: I0223 08:34:16.582749 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8x4fh"] Feb 23 08:34:16 crc kubenswrapper[5047]: E0223 08:34:16.585525 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07e3ae2-20c0-48e9-9daf-58476130de7c" containerName="mariadb-client" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.585565 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e3ae2-20c0-48e9-9daf-58476130de7c" containerName="mariadb-client" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.585990 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07e3ae2-20c0-48e9-9daf-58476130de7c" containerName="mariadb-client" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.588340 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.596694 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8x4fh"] Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.672480 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wngrl\" (UniqueName: \"kubernetes.io/projected/526dcdbe-b00e-4d13-9d28-95ad35819635-kube-api-access-wngrl\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.672563 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-catalog-content\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: 
I0223 08:34:16.672594 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-utilities\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.760131 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.760194 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.760246 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.760948 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc00fe3df405381f420e2b1840543fc3d733eab4a154b2a6713423066de3f8b0"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.761009 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" 
podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://dc00fe3df405381f420e2b1840543fc3d733eab4a154b2a6713423066de3f8b0" gracePeriod=600 Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.773682 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wngrl\" (UniqueName: \"kubernetes.io/projected/526dcdbe-b00e-4d13-9d28-95ad35819635-kube-api-access-wngrl\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.773749 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-catalog-content\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.773776 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-utilities\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.774366 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-utilities\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.774570 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-catalog-content\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.807253 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wngrl\" (UniqueName: \"kubernetes.io/projected/526dcdbe-b00e-4d13-9d28-95ad35819635-kube-api-access-wngrl\") pod \"certified-operators-8x4fh\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:16 crc kubenswrapper[5047]: I0223 08:34:16.919145 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:17 crc kubenswrapper[5047]: I0223 08:34:17.154698 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="dc00fe3df405381f420e2b1840543fc3d733eab4a154b2a6713423066de3f8b0" exitCode=0 Feb 23 08:34:17 crc kubenswrapper[5047]: I0223 08:34:17.154993 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"dc00fe3df405381f420e2b1840543fc3d733eab4a154b2a6713423066de3f8b0"} Feb 23 08:34:17 crc kubenswrapper[5047]: I0223 08:34:17.155019 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357"} Feb 23 08:34:17 crc kubenswrapper[5047]: I0223 08:34:17.155035 5047 scope.go:117] "RemoveContainer" containerID="857f39ae96e2ccb49b22242c1a50b6ba874c4e9b8c6dd7bfe70d3e6f77c68af0" Feb 23 08:34:17 crc kubenswrapper[5047]: I0223 08:34:17.406357 5047 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8x4fh"] Feb 23 08:34:17 crc kubenswrapper[5047]: W0223 08:34:17.417722 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526dcdbe_b00e_4d13_9d28_95ad35819635.slice/crio-62bd78fd557b52365adb6eec3e3034e0ca1554225f306c153940468be5e5c33c WatchSource:0}: Error finding container 62bd78fd557b52365adb6eec3e3034e0ca1554225f306c153940468be5e5c33c: Status 404 returned error can't find the container with id 62bd78fd557b52365adb6eec3e3034e0ca1554225f306c153940468be5e5c33c Feb 23 08:34:18 crc kubenswrapper[5047]: I0223 08:34:18.167326 5047 generic.go:334] "Generic (PLEG): container finished" podID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerID="b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9" exitCode=0 Feb 23 08:34:18 crc kubenswrapper[5047]: I0223 08:34:18.167477 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8x4fh" event={"ID":"526dcdbe-b00e-4d13-9d28-95ad35819635","Type":"ContainerDied","Data":"b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9"} Feb 23 08:34:18 crc kubenswrapper[5047]: I0223 08:34:18.168006 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8x4fh" event={"ID":"526dcdbe-b00e-4d13-9d28-95ad35819635","Type":"ContainerStarted","Data":"62bd78fd557b52365adb6eec3e3034e0ca1554225f306c153940468be5e5c33c"} Feb 23 08:34:20 crc kubenswrapper[5047]: I0223 08:34:20.193684 5047 generic.go:334] "Generic (PLEG): container finished" podID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerID="73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415" exitCode=0 Feb 23 08:34:20 crc kubenswrapper[5047]: I0223 08:34:20.193801 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8x4fh" 
event={"ID":"526dcdbe-b00e-4d13-9d28-95ad35819635","Type":"ContainerDied","Data":"73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415"} Feb 23 08:34:21 crc kubenswrapper[5047]: I0223 08:34:21.207077 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8x4fh" event={"ID":"526dcdbe-b00e-4d13-9d28-95ad35819635","Type":"ContainerStarted","Data":"db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2"} Feb 23 08:34:21 crc kubenswrapper[5047]: I0223 08:34:21.243406 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8x4fh" podStartSLOduration=2.823534347 podStartE2EDuration="5.243375723s" podCreationTimestamp="2026-02-23 08:34:16 +0000 UTC" firstStartedPulling="2026-02-23 08:34:18.172000234 +0000 UTC m=+6580.423327408" lastFinishedPulling="2026-02-23 08:34:20.59184161 +0000 UTC m=+6582.843168784" observedRunningTime="2026-02-23 08:34:21.236360944 +0000 UTC m=+6583.487688108" watchObservedRunningTime="2026-02-23 08:34:21.243375723 +0000 UTC m=+6583.494702887" Feb 23 08:34:26 crc kubenswrapper[5047]: I0223 08:34:26.920309 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:26 crc kubenswrapper[5047]: I0223 08:34:26.921375 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:26 crc kubenswrapper[5047]: I0223 08:34:26.996591 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:27 crc kubenswrapper[5047]: I0223 08:34:27.365254 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:27 crc kubenswrapper[5047]: I0223 08:34:27.441029 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-8x4fh"] Feb 23 08:34:29 crc kubenswrapper[5047]: I0223 08:34:29.300706 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8x4fh" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="registry-server" containerID="cri-o://db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2" gracePeriod=2 Feb 23 08:34:29 crc kubenswrapper[5047]: I0223 08:34:29.805663 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:29 crc kubenswrapper[5047]: I0223 08:34:29.950474 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wngrl\" (UniqueName: \"kubernetes.io/projected/526dcdbe-b00e-4d13-9d28-95ad35819635-kube-api-access-wngrl\") pod \"526dcdbe-b00e-4d13-9d28-95ad35819635\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " Feb 23 08:34:29 crc kubenswrapper[5047]: I0223 08:34:29.950552 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-catalog-content\") pod \"526dcdbe-b00e-4d13-9d28-95ad35819635\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " Feb 23 08:34:29 crc kubenswrapper[5047]: I0223 08:34:29.950618 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-utilities\") pod \"526dcdbe-b00e-4d13-9d28-95ad35819635\" (UID: \"526dcdbe-b00e-4d13-9d28-95ad35819635\") " Feb 23 08:34:29 crc kubenswrapper[5047]: I0223 08:34:29.952568 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-utilities" (OuterVolumeSpecName: "utilities") pod "526dcdbe-b00e-4d13-9d28-95ad35819635" (UID: 
"526dcdbe-b00e-4d13-9d28-95ad35819635"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:34:29 crc kubenswrapper[5047]: I0223 08:34:29.961378 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/526dcdbe-b00e-4d13-9d28-95ad35819635-kube-api-access-wngrl" (OuterVolumeSpecName: "kube-api-access-wngrl") pod "526dcdbe-b00e-4d13-9d28-95ad35819635" (UID: "526dcdbe-b00e-4d13-9d28-95ad35819635"). InnerVolumeSpecName "kube-api-access-wngrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.053162 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.053220 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wngrl\" (UniqueName: \"kubernetes.io/projected/526dcdbe-b00e-4d13-9d28-95ad35819635-kube-api-access-wngrl\") on node \"crc\" DevicePath \"\"" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.225276 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "526dcdbe-b00e-4d13-9d28-95ad35819635" (UID: "526dcdbe-b00e-4d13-9d28-95ad35819635"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.256832 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526dcdbe-b00e-4d13-9d28-95ad35819635-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.323723 5047 generic.go:334] "Generic (PLEG): container finished" podID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerID="db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2" exitCode=0 Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.323796 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8x4fh" event={"ID":"526dcdbe-b00e-4d13-9d28-95ad35819635","Type":"ContainerDied","Data":"db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2"} Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.323840 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8x4fh" event={"ID":"526dcdbe-b00e-4d13-9d28-95ad35819635","Type":"ContainerDied","Data":"62bd78fd557b52365adb6eec3e3034e0ca1554225f306c153940468be5e5c33c"} Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.323878 5047 scope.go:117] "RemoveContainer" containerID="db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.323900 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8x4fh" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.377492 5047 scope.go:117] "RemoveContainer" containerID="73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.389300 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8x4fh"] Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.400129 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8x4fh"] Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.405253 5047 scope.go:117] "RemoveContainer" containerID="b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.445502 5047 scope.go:117] "RemoveContainer" containerID="db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2" Feb 23 08:34:30 crc kubenswrapper[5047]: E0223 08:34:30.446027 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2\": container with ID starting with db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2 not found: ID does not exist" containerID="db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.446067 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2"} err="failed to get container status \"db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2\": rpc error: code = NotFound desc = could not find container \"db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2\": container with ID starting with db1e37d120d0769078f7cc1fd4e0421d185de890d737c55cfcef7e44b06364a2 not 
found: ID does not exist" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.446098 5047 scope.go:117] "RemoveContainer" containerID="73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415" Feb 23 08:34:30 crc kubenswrapper[5047]: E0223 08:34:30.446544 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415\": container with ID starting with 73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415 not found: ID does not exist" containerID="73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.446572 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415"} err="failed to get container status \"73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415\": rpc error: code = NotFound desc = could not find container \"73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415\": container with ID starting with 73c7ff98efea7d9c4794a2712536915a0c469b5ce76d268845e20abfa4491415 not found: ID does not exist" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.446589 5047 scope.go:117] "RemoveContainer" containerID="b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9" Feb 23 08:34:30 crc kubenswrapper[5047]: E0223 08:34:30.446879 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9\": container with ID starting with b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9 not found: ID does not exist" containerID="b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9" Feb 23 08:34:30 crc kubenswrapper[5047]: I0223 08:34:30.447000 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9"} err="failed to get container status \"b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9\": rpc error: code = NotFound desc = could not find container \"b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9\": container with ID starting with b1ebb5b0a36159336aa4a2f4278f14864ad5b4ec5d6a972cc2808b5575a531d9 not found: ID does not exist" Feb 23 08:34:32 crc kubenswrapper[5047]: I0223 08:34:32.361601 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" path="/var/lib/kubelet/pods/526dcdbe-b00e-4d13-9d28-95ad35819635/volumes" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.821267 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9dslh"] Feb 23 08:35:00 crc kubenswrapper[5047]: E0223 08:35:00.822523 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="extract-content" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.822546 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="extract-content" Feb 23 08:35:00 crc kubenswrapper[5047]: E0223 08:35:00.822571 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="registry-server" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.822583 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="registry-server" Feb 23 08:35:00 crc kubenswrapper[5047]: E0223 08:35:00.822603 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="extract-utilities" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 
08:35:00.822616 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="extract-utilities" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.822887 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="526dcdbe-b00e-4d13-9d28-95ad35819635" containerName="registry-server" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.824785 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.837595 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dslh"] Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.888499 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-catalog-content\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.888664 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-utilities\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.888706 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2sk\" (UniqueName: \"kubernetes.io/projected/71326e33-03f3-4acf-91fe-a37d9edbef26-kube-api-access-hp2sk\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc 
kubenswrapper[5047]: I0223 08:35:00.990078 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-catalog-content\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.990393 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-utilities\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.990478 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2sk\" (UniqueName: \"kubernetes.io/projected/71326e33-03f3-4acf-91fe-a37d9edbef26-kube-api-access-hp2sk\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.990727 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-catalog-content\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:00 crc kubenswrapper[5047]: I0223 08:35:00.990891 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-utilities\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:01 crc kubenswrapper[5047]: I0223 08:35:01.023143 
5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2sk\" (UniqueName: \"kubernetes.io/projected/71326e33-03f3-4acf-91fe-a37d9edbef26-kube-api-access-hp2sk\") pod \"community-operators-9dslh\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:01 crc kubenswrapper[5047]: I0223 08:35:01.151369 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:01 crc kubenswrapper[5047]: I0223 08:35:01.633571 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dslh"] Feb 23 08:35:02 crc kubenswrapper[5047]: I0223 08:35:02.069159 5047 generic.go:334] "Generic (PLEG): container finished" podID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerID="c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567" exitCode=0 Feb 23 08:35:02 crc kubenswrapper[5047]: I0223 08:35:02.069268 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dslh" event={"ID":"71326e33-03f3-4acf-91fe-a37d9edbef26","Type":"ContainerDied","Data":"c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567"} Feb 23 08:35:02 crc kubenswrapper[5047]: I0223 08:35:02.069318 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dslh" event={"ID":"71326e33-03f3-4acf-91fe-a37d9edbef26","Type":"ContainerStarted","Data":"24e686f4b1a96336b7815c4f23d9f81ca47afafdac3a466d803a31a44cd9ffe9"} Feb 23 08:35:03 crc kubenswrapper[5047]: I0223 08:35:03.080630 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dslh" event={"ID":"71326e33-03f3-4acf-91fe-a37d9edbef26","Type":"ContainerStarted","Data":"bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc"} Feb 23 08:35:04 crc kubenswrapper[5047]: I0223 08:35:04.089412 5047 
generic.go:334] "Generic (PLEG): container finished" podID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerID="bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc" exitCode=0 Feb 23 08:35:04 crc kubenswrapper[5047]: I0223 08:35:04.089575 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dslh" event={"ID":"71326e33-03f3-4acf-91fe-a37d9edbef26","Type":"ContainerDied","Data":"bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc"} Feb 23 08:35:05 crc kubenswrapper[5047]: I0223 08:35:05.104834 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dslh" event={"ID":"71326e33-03f3-4acf-91fe-a37d9edbef26","Type":"ContainerStarted","Data":"8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52"} Feb 23 08:35:05 crc kubenswrapper[5047]: I0223 08:35:05.137557 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9dslh" podStartSLOduration=2.717443293 podStartE2EDuration="5.137526855s" podCreationTimestamp="2026-02-23 08:35:00 +0000 UTC" firstStartedPulling="2026-02-23 08:35:02.073435693 +0000 UTC m=+6624.324762867" lastFinishedPulling="2026-02-23 08:35:04.493519285 +0000 UTC m=+6626.744846429" observedRunningTime="2026-02-23 08:35:05.127397603 +0000 UTC m=+6627.378724737" watchObservedRunningTime="2026-02-23 08:35:05.137526855 +0000 UTC m=+6627.388854029" Feb 23 08:35:11 crc kubenswrapper[5047]: I0223 08:35:11.152104 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:11 crc kubenswrapper[5047]: I0223 08:35:11.152623 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:11 crc kubenswrapper[5047]: I0223 08:35:11.236038 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:11 crc kubenswrapper[5047]: I0223 08:35:11.304184 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.197330 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dslh"] Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.199984 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9dslh" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerName="registry-server" containerID="cri-o://8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52" gracePeriod=2 Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.754938 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.781235 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-utilities\") pod \"71326e33-03f3-4acf-91fe-a37d9edbef26\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.781386 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2sk\" (UniqueName: \"kubernetes.io/projected/71326e33-03f3-4acf-91fe-a37d9edbef26-kube-api-access-hp2sk\") pod \"71326e33-03f3-4acf-91fe-a37d9edbef26\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.781487 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-catalog-content\") pod 
\"71326e33-03f3-4acf-91fe-a37d9edbef26\" (UID: \"71326e33-03f3-4acf-91fe-a37d9edbef26\") " Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.784037 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-utilities" (OuterVolumeSpecName: "utilities") pod "71326e33-03f3-4acf-91fe-a37d9edbef26" (UID: "71326e33-03f3-4acf-91fe-a37d9edbef26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.795697 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71326e33-03f3-4acf-91fe-a37d9edbef26-kube-api-access-hp2sk" (OuterVolumeSpecName: "kube-api-access-hp2sk") pod "71326e33-03f3-4acf-91fe-a37d9edbef26" (UID: "71326e33-03f3-4acf-91fe-a37d9edbef26"). InnerVolumeSpecName "kube-api-access-hp2sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.853816 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71326e33-03f3-4acf-91fe-a37d9edbef26" (UID: "71326e33-03f3-4acf-91fe-a37d9edbef26"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.882452 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp2sk\" (UniqueName: \"kubernetes.io/projected/71326e33-03f3-4acf-91fe-a37d9edbef26-kube-api-access-hp2sk\") on node \"crc\" DevicePath \"\"" Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.882518 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:35:14 crc kubenswrapper[5047]: I0223 08:35:14.882528 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71326e33-03f3-4acf-91fe-a37d9edbef26-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.208786 5047 generic.go:334] "Generic (PLEG): container finished" podID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerID="8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52" exitCode=0 Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.208829 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dslh" event={"ID":"71326e33-03f3-4acf-91fe-a37d9edbef26","Type":"ContainerDied","Data":"8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52"} Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.208836 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9dslh" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.208861 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dslh" event={"ID":"71326e33-03f3-4acf-91fe-a37d9edbef26","Type":"ContainerDied","Data":"24e686f4b1a96336b7815c4f23d9f81ca47afafdac3a466d803a31a44cd9ffe9"} Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.208881 5047 scope.go:117] "RemoveContainer" containerID="8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.240596 5047 scope.go:117] "RemoveContainer" containerID="bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.254521 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dslh"] Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.267194 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9dslh"] Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.272704 5047 scope.go:117] "RemoveContainer" containerID="c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.312211 5047 scope.go:117] "RemoveContainer" containerID="8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52" Feb 23 08:35:15 crc kubenswrapper[5047]: E0223 08:35:15.312675 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52\": container with ID starting with 8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52 not found: ID does not exist" containerID="8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.312708 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52"} err="failed to get container status \"8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52\": rpc error: code = NotFound desc = could not find container \"8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52\": container with ID starting with 8faafdd6d60be5f877f0bb5ec5ecab251af5202165eb9c261e0cba4853418b52 not found: ID does not exist" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.312730 5047 scope.go:117] "RemoveContainer" containerID="bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc" Feb 23 08:35:15 crc kubenswrapper[5047]: E0223 08:35:15.313306 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc\": container with ID starting with bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc not found: ID does not exist" containerID="bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.313326 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc"} err="failed to get container status \"bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc\": rpc error: code = NotFound desc = could not find container \"bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc\": container with ID starting with bcdeba4e11b18e63e5c93e1d671bc3af0d53d7e7204345f79635852c3a9f1cfc not found: ID does not exist" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.313340 5047 scope.go:117] "RemoveContainer" containerID="c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567" Feb 23 08:35:15 crc kubenswrapper[5047]: E0223 
08:35:15.313871 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567\": container with ID starting with c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567 not found: ID does not exist" containerID="c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567" Feb 23 08:35:15 crc kubenswrapper[5047]: I0223 08:35:15.313900 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567"} err="failed to get container status \"c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567\": rpc error: code = NotFound desc = could not find container \"c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567\": container with ID starting with c0f48cd930b6c9fc29a281bb9d62694f172264bf13270e7bb3246f98c6dae567 not found: ID does not exist" Feb 23 08:35:16 crc kubenswrapper[5047]: I0223 08:35:16.358703 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" path="/var/lib/kubelet/pods/71326e33-03f3-4acf-91fe-a37d9edbef26/volumes" Feb 23 08:36:46 crc kubenswrapper[5047]: I0223 08:36:46.760334 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:36:46 crc kubenswrapper[5047]: I0223 08:36:46.761240 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 08:36:55 crc kubenswrapper[5047]: I0223 08:36:55.990770 5047 scope.go:117] "RemoveContainer" containerID="6fbe93cb216655ba7c9450d1c62801e8c95c2afe0c00afd532f5ed1da9b6cacb" Feb 23 08:37:16 crc kubenswrapper[5047]: I0223 08:37:16.759654 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:37:16 crc kubenswrapper[5047]: I0223 08:37:16.760307 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.759613 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.760518 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.760592 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.761686 5047 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.761842 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" gracePeriod=600 Feb 23 08:37:46 crc kubenswrapper[5047]: E0223 08:37:46.899295 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.936374 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" exitCode=0 Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.936431 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357"} Feb 23 08:37:46 crc kubenswrapper[5047]: I0223 08:37:46.936492 5047 scope.go:117] "RemoveContainer" containerID="dc00fe3df405381f420e2b1840543fc3d733eab4a154b2a6713423066de3f8b0" Feb 23 08:37:46 crc 
kubenswrapper[5047]: I0223 08:37:46.937120 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:37:46 crc kubenswrapper[5047]: E0223 08:37:46.937472 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.124082 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-czbx2"] Feb 23 08:37:55 crc kubenswrapper[5047]: E0223 08:37:55.124917 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerName="extract-utilities" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.124931 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerName="extract-utilities" Feb 23 08:37:55 crc kubenswrapper[5047]: E0223 08:37:55.124946 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerName="extract-content" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.124952 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerName="extract-content" Feb 23 08:37:55 crc kubenswrapper[5047]: E0223 08:37:55.124978 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerName="registry-server" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.124985 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" 
containerName="registry-server" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.125135 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="71326e33-03f3-4acf-91fe-a37d9edbef26" containerName="registry-server" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.126185 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.138709 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-czbx2"] Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.284346 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-catalog-content\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.284427 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvt4l\" (UniqueName: \"kubernetes.io/projected/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-kube-api-access-dvt4l\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.284571 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-utilities\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.385732 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-utilities\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.385874 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-catalog-content\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.385922 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvt4l\" (UniqueName: \"kubernetes.io/projected/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-kube-api-access-dvt4l\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.386388 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-utilities\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.386428 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-catalog-content\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.413779 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvt4l\" (UniqueName: 
\"kubernetes.io/projected/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-kube-api-access-dvt4l\") pod \"redhat-marketplace-czbx2\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.444129 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:37:55 crc kubenswrapper[5047]: I0223 08:37:55.913831 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-czbx2"] Feb 23 08:37:56 crc kubenswrapper[5047]: I0223 08:37:56.025963 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czbx2" event={"ID":"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd","Type":"ContainerStarted","Data":"1998e7b76b35415fe7b824636ae9cb5fdd1b9d1caa231da5c2ca1953bf657132"} Feb 23 08:37:57 crc kubenswrapper[5047]: I0223 08:37:57.038556 5047 generic.go:334] "Generic (PLEG): container finished" podID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerID="3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a" exitCode=0 Feb 23 08:37:57 crc kubenswrapper[5047]: I0223 08:37:57.038649 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czbx2" event={"ID":"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd","Type":"ContainerDied","Data":"3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a"} Feb 23 08:37:57 crc kubenswrapper[5047]: I0223 08:37:57.042340 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:37:58 crc kubenswrapper[5047]: I0223 08:37:58.056415 5047 generic.go:334] "Generic (PLEG): container finished" podID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerID="2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74" exitCode=0 Feb 23 08:37:58 crc kubenswrapper[5047]: I0223 08:37:58.056515 5047 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czbx2" event={"ID":"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd","Type":"ContainerDied","Data":"2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74"} Feb 23 08:37:59 crc kubenswrapper[5047]: I0223 08:37:59.085068 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czbx2" event={"ID":"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd","Type":"ContainerStarted","Data":"4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2"} Feb 23 08:37:59 crc kubenswrapper[5047]: I0223 08:37:59.120533 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-czbx2" podStartSLOduration=2.711921104 podStartE2EDuration="4.120514689s" podCreationTimestamp="2026-02-23 08:37:55 +0000 UTC" firstStartedPulling="2026-02-23 08:37:57.042008348 +0000 UTC m=+6799.293335502" lastFinishedPulling="2026-02-23 08:37:58.450601943 +0000 UTC m=+6800.701929087" observedRunningTime="2026-02-23 08:37:59.116590134 +0000 UTC m=+6801.367917288" watchObservedRunningTime="2026-02-23 08:37:59.120514689 +0000 UTC m=+6801.371841813" Feb 23 08:37:59 crc kubenswrapper[5047]: I0223 08:37:59.341088 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:37:59 crc kubenswrapper[5047]: E0223 08:37:59.341830 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:38:05 crc kubenswrapper[5047]: I0223 08:38:05.445701 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:38:05 crc kubenswrapper[5047]: I0223 08:38:05.448740 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:38:05 crc kubenswrapper[5047]: I0223 08:38:05.524150 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:38:06 crc kubenswrapper[5047]: I0223 08:38:06.220872 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:38:06 crc kubenswrapper[5047]: I0223 08:38:06.293463 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-czbx2"] Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.176896 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-czbx2" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="registry-server" containerID="cri-o://4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2" gracePeriod=2 Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.645685 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.735571 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-catalog-content\") pod \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.735750 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-utilities\") pod \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.735820 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvt4l\" (UniqueName: \"kubernetes.io/projected/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-kube-api-access-dvt4l\") pod \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\" (UID: \"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd\") " Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.737533 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-utilities" (OuterVolumeSpecName: "utilities") pod "ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" (UID: "ebf040c9-b5af-472b-8a2a-1c7a8b3119cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.747136 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-kube-api-access-dvt4l" (OuterVolumeSpecName: "kube-api-access-dvt4l") pod "ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" (UID: "ebf040c9-b5af-472b-8a2a-1c7a8b3119cd"). InnerVolumeSpecName "kube-api-access-dvt4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.771276 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" (UID: "ebf040c9-b5af-472b-8a2a-1c7a8b3119cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.838798 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.838854 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:38:08 crc kubenswrapper[5047]: I0223 08:38:08.838875 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvt4l\" (UniqueName: \"kubernetes.io/projected/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd-kube-api-access-dvt4l\") on node \"crc\" DevicePath \"\"" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.193031 5047 generic.go:334] "Generic (PLEG): container finished" podID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerID="4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2" exitCode=0 Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.193101 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czbx2" event={"ID":"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd","Type":"ContainerDied","Data":"4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2"} Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.193144 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-czbx2" event={"ID":"ebf040c9-b5af-472b-8a2a-1c7a8b3119cd","Type":"ContainerDied","Data":"1998e7b76b35415fe7b824636ae9cb5fdd1b9d1caa231da5c2ca1953bf657132"} Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.193175 5047 scope.go:117] "RemoveContainer" containerID="4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.194469 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czbx2" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.250175 5047 scope.go:117] "RemoveContainer" containerID="2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.255348 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-czbx2"] Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.273818 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-czbx2"] Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.278831 5047 scope.go:117] "RemoveContainer" containerID="3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.330702 5047 scope.go:117] "RemoveContainer" containerID="4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2" Feb 23 08:38:09 crc kubenswrapper[5047]: E0223 08:38:09.331989 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2\": container with ID starting with 4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2 not found: ID does not exist" containerID="4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.332092 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2"} err="failed to get container status \"4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2\": rpc error: code = NotFound desc = could not find container \"4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2\": container with ID starting with 4d1d1945a392130ee03f635e1a5073643f078899edf35ad7d9e9c39893f76bc2 not found: ID does not exist" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.332150 5047 scope.go:117] "RemoveContainer" containerID="2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74" Feb 23 08:38:09 crc kubenswrapper[5047]: E0223 08:38:09.332619 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74\": container with ID starting with 2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74 not found: ID does not exist" containerID="2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.332733 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74"} err="failed to get container status \"2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74\": rpc error: code = NotFound desc = could not find container \"2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74\": container with ID starting with 2a3249401c2c82dc8f7a5e72c6e565d9ebfeed9b486f702e9e34f20ed564fd74 not found: ID does not exist" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.332813 5047 scope.go:117] "RemoveContainer" containerID="3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a" Feb 23 08:38:09 crc kubenswrapper[5047]: E0223 
08:38:09.333384 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a\": container with ID starting with 3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a not found: ID does not exist" containerID="3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a" Feb 23 08:38:09 crc kubenswrapper[5047]: I0223 08:38:09.333444 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a"} err="failed to get container status \"3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a\": rpc error: code = NotFound desc = could not find container \"3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a\": container with ID starting with 3fda5a99379ce048626166f80991543c035c602b0aece1cbd22ba8378478dd6a not found: ID does not exist" Feb 23 08:38:10 crc kubenswrapper[5047]: I0223 08:38:10.341283 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:38:10 crc kubenswrapper[5047]: E0223 08:38:10.341949 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:38:10 crc kubenswrapper[5047]: I0223 08:38:10.364830 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" path="/var/lib/kubelet/pods/ebf040c9-b5af-472b-8a2a-1c7a8b3119cd/volumes" Feb 23 08:38:22 crc kubenswrapper[5047]: I0223 08:38:22.341345 
5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:38:22 crc kubenswrapper[5047]: E0223 08:38:22.344659 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:38:33 crc kubenswrapper[5047]: I0223 08:38:33.341774 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:38:33 crc kubenswrapper[5047]: E0223 08:38:33.343235 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:38:44 crc kubenswrapper[5047]: I0223 08:38:44.342187 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:38:44 crc kubenswrapper[5047]: E0223 08:38:44.343463 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:38:57 crc kubenswrapper[5047]: I0223 
08:38:57.341525 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:38:57 crc kubenswrapper[5047]: E0223 08:38:57.342562 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:39:11 crc kubenswrapper[5047]: I0223 08:39:11.342048 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:39:11 crc kubenswrapper[5047]: E0223 08:39:11.343412 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:39:23 crc kubenswrapper[5047]: I0223 08:39:23.341341 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:39:23 crc kubenswrapper[5047]: E0223 08:39:23.342527 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:39:35 crc 
kubenswrapper[5047]: I0223 08:39:35.341133 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:39:35 crc kubenswrapper[5047]: E0223 08:39:35.342232 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:39:49 crc kubenswrapper[5047]: I0223 08:39:49.341018 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:39:49 crc kubenswrapper[5047]: E0223 08:39:49.342210 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:40:04 crc kubenswrapper[5047]: I0223 08:40:04.341678 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:40:04 crc kubenswrapper[5047]: E0223 08:40:04.343054 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 
23 08:40:19 crc kubenswrapper[5047]: I0223 08:40:19.341466 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:40:19 crc kubenswrapper[5047]: E0223 08:40:19.342350 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:40:20 crc kubenswrapper[5047]: I0223 08:40:20.088952 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z9f6r"] Feb 23 08:40:20 crc kubenswrapper[5047]: I0223 08:40:20.097383 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z9f6r"] Feb 23 08:40:20 crc kubenswrapper[5047]: I0223 08:40:20.352819 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3877d0e2-e421-45d2-ab02-608987a4ab03" path="/var/lib/kubelet/pods/3877d0e2-e421-45d2-ab02-608987a4ab03/volumes" Feb 23 08:40:34 crc kubenswrapper[5047]: I0223 08:40:34.341695 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:40:34 crc kubenswrapper[5047]: E0223 08:40:34.343300 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.128686 5047 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 08:40:39 crc kubenswrapper[5047]: E0223 08:40:39.130054 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="extract-content" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.130079 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="extract-content" Feb 23 08:40:39 crc kubenswrapper[5047]: E0223 08:40:39.130112 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="extract-utilities" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.130122 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="extract-utilities" Feb 23 08:40:39 crc kubenswrapper[5047]: E0223 08:40:39.130154 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="registry-server" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.130164 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="registry-server" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.130418 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf040c9-b5af-472b-8a2a-1c7a8b3119cd" containerName="registry-server" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.131313 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.134014 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tnc55" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.140759 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.317988 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e00019bb-2318-453f-86fb-844b28e7993c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c\") pod \"mariadb-copy-data\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " pod="openstack/mariadb-copy-data" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.318421 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwzc\" (UniqueName: \"kubernetes.io/projected/f7982ba2-03dd-461c-bef5-d4223fd9ddd5-kube-api-access-mlwzc\") pod \"mariadb-copy-data\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " pod="openstack/mariadb-copy-data" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.420568 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e00019bb-2318-453f-86fb-844b28e7993c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c\") pod \"mariadb-copy-data\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " pod="openstack/mariadb-copy-data" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.420716 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwzc\" (UniqueName: \"kubernetes.io/projected/f7982ba2-03dd-461c-bef5-d4223fd9ddd5-kube-api-access-mlwzc\") pod \"mariadb-copy-data\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " pod="openstack/mariadb-copy-data" 
Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.425656 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.425748 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e00019bb-2318-453f-86fb-844b28e7993c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c\") pod \"mariadb-copy-data\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6d0d26aa3b4b1d3fe824ae7247ea0c7bc0fa4973b4bfedf7ace83de09e3fa76e/globalmount\"" pod="openstack/mariadb-copy-data" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.455854 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwzc\" (UniqueName: \"kubernetes.io/projected/f7982ba2-03dd-461c-bef5-d4223fd9ddd5-kube-api-access-mlwzc\") pod \"mariadb-copy-data\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " pod="openstack/mariadb-copy-data" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.472855 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e00019bb-2318-453f-86fb-844b28e7993c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c\") pod \"mariadb-copy-data\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " pod="openstack/mariadb-copy-data" Feb 23 08:40:39 crc kubenswrapper[5047]: I0223 08:40:39.765132 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 23 08:40:40 crc kubenswrapper[5047]: I0223 08:40:40.189068 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 08:40:40 crc kubenswrapper[5047]: I0223 08:40:40.782303 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f7982ba2-03dd-461c-bef5-d4223fd9ddd5","Type":"ContainerStarted","Data":"275245be4e1d753203f6c3c8cc1e21aa7ee24902f5fb7405f17a6544e6c06d89"} Feb 23 08:40:40 crc kubenswrapper[5047]: I0223 08:40:40.782395 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f7982ba2-03dd-461c-bef5-d4223fd9ddd5","Type":"ContainerStarted","Data":"bca99637c6f292db0aaad36e9beadb2bd35f59a9d94a6bd7f9450fcfc165a7b2"} Feb 23 08:40:40 crc kubenswrapper[5047]: I0223 08:40:40.807253 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.807233819 podStartE2EDuration="2.807233819s" podCreationTimestamp="2026-02-23 08:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:40:40.804324851 +0000 UTC m=+6963.055651985" watchObservedRunningTime="2026-02-23 08:40:40.807233819 +0000 UTC m=+6963.058560943" Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.247581 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.250001 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.290370 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.417216 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xklzx\" (UniqueName: \"kubernetes.io/projected/65f24843-9f11-45ae-85d1-79f8c330e645-kube-api-access-xklzx\") pod \"mariadb-client\" (UID: \"65f24843-9f11-45ae-85d1-79f8c330e645\") " pod="openstack/mariadb-client" Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.519376 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xklzx\" (UniqueName: \"kubernetes.io/projected/65f24843-9f11-45ae-85d1-79f8c330e645-kube-api-access-xklzx\") pod \"mariadb-client\" (UID: \"65f24843-9f11-45ae-85d1-79f8c330e645\") " pod="openstack/mariadb-client" Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.555599 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xklzx\" (UniqueName: \"kubernetes.io/projected/65f24843-9f11-45ae-85d1-79f8c330e645-kube-api-access-xklzx\") pod \"mariadb-client\" (UID: \"65f24843-9f11-45ae-85d1-79f8c330e645\") " pod="openstack/mariadb-client" Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.610696 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:44 crc kubenswrapper[5047]: I0223 08:40:44.902799 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:44 crc kubenswrapper[5047]: W0223 08:40:44.908145 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f24843_9f11_45ae_85d1_79f8c330e645.slice/crio-e542e1da7952f97955ce11cd2b601b3dd99d09717f17ce295d147343eaff886e WatchSource:0}: Error finding container e542e1da7952f97955ce11cd2b601b3dd99d09717f17ce295d147343eaff886e: Status 404 returned error can't find the container with id e542e1da7952f97955ce11cd2b601b3dd99d09717f17ce295d147343eaff886e Feb 23 08:40:45 crc kubenswrapper[5047]: I0223 08:40:45.854794 5047 generic.go:334] "Generic (PLEG): container finished" podID="65f24843-9f11-45ae-85d1-79f8c330e645" containerID="7c22430a25e9a0f924e9db23fa9441d74399301b524a94e01670b1358c8a671a" exitCode=0 Feb 23 08:40:45 crc kubenswrapper[5047]: I0223 08:40:45.855338 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"65f24843-9f11-45ae-85d1-79f8c330e645","Type":"ContainerDied","Data":"7c22430a25e9a0f924e9db23fa9441d74399301b524a94e01670b1358c8a671a"} Feb 23 08:40:45 crc kubenswrapper[5047]: I0223 08:40:45.855393 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"65f24843-9f11-45ae-85d1-79f8c330e645","Type":"ContainerStarted","Data":"e542e1da7952f97955ce11cd2b601b3dd99d09717f17ce295d147343eaff886e"} Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.299390 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.330380 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_65f24843-9f11-45ae-85d1-79f8c330e645/mariadb-client/0.log" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.364801 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.370941 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.488653 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xklzx\" (UniqueName: \"kubernetes.io/projected/65f24843-9f11-45ae-85d1-79f8c330e645-kube-api-access-xklzx\") pod \"65f24843-9f11-45ae-85d1-79f8c330e645\" (UID: \"65f24843-9f11-45ae-85d1-79f8c330e645\") " Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.498077 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f24843-9f11-45ae-85d1-79f8c330e645-kube-api-access-xklzx" (OuterVolumeSpecName: "kube-api-access-xklzx") pod "65f24843-9f11-45ae-85d1-79f8c330e645" (UID: "65f24843-9f11-45ae-85d1-79f8c330e645"). InnerVolumeSpecName "kube-api-access-xklzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.549277 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:47 crc kubenswrapper[5047]: E0223 08:40:47.550021 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f24843-9f11-45ae-85d1-79f8c330e645" containerName="mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.550056 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f24843-9f11-45ae-85d1-79f8c330e645" containerName="mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.550356 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f24843-9f11-45ae-85d1-79f8c330e645" containerName="mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.551283 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.562646 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.590754 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xklzx\" (UniqueName: \"kubernetes.io/projected/65f24843-9f11-45ae-85d1-79f8c330e645-kube-api-access-xklzx\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.692540 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzpnr\" (UniqueName: \"kubernetes.io/projected/a36c8032-a285-4492-8ac3-6b82b455b865-kube-api-access-dzpnr\") pod \"mariadb-client\" (UID: \"a36c8032-a285-4492-8ac3-6b82b455b865\") " pod="openstack/mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.794681 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzpnr\" (UniqueName: 
\"kubernetes.io/projected/a36c8032-a285-4492-8ac3-6b82b455b865-kube-api-access-dzpnr\") pod \"mariadb-client\" (UID: \"a36c8032-a285-4492-8ac3-6b82b455b865\") " pod="openstack/mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.818822 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzpnr\" (UniqueName: \"kubernetes.io/projected/a36c8032-a285-4492-8ac3-6b82b455b865-kube-api-access-dzpnr\") pod \"mariadb-client\" (UID: \"a36c8032-a285-4492-8ac3-6b82b455b865\") " pod="openstack/mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.877865 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e542e1da7952f97955ce11cd2b601b3dd99d09717f17ce295d147343eaff886e" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.877986 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.891612 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:47 crc kubenswrapper[5047]: I0223 08:40:47.915048 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="65f24843-9f11-45ae-85d1-79f8c330e645" podUID="a36c8032-a285-4492-8ac3-6b82b455b865" Feb 23 08:40:48 crc kubenswrapper[5047]: I0223 08:40:48.351282 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:40:48 crc kubenswrapper[5047]: E0223 08:40:48.352000 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:40:48 crc kubenswrapper[5047]: I0223 08:40:48.353414 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f24843-9f11-45ae-85d1-79f8c330e645" path="/var/lib/kubelet/pods/65f24843-9f11-45ae-85d1-79f8c330e645/volumes" Feb 23 08:40:48 crc kubenswrapper[5047]: I0223 08:40:48.404202 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:48 crc kubenswrapper[5047]: W0223 08:40:48.413622 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda36c8032_a285_4492_8ac3_6b82b455b865.slice/crio-939d212c39951f71f106ff8d32ebb2aeaecba663285c836a72dc7e5f84d57a35 WatchSource:0}: Error finding container 939d212c39951f71f106ff8d32ebb2aeaecba663285c836a72dc7e5f84d57a35: Status 404 returned error can't find the container with id 939d212c39951f71f106ff8d32ebb2aeaecba663285c836a72dc7e5f84d57a35 Feb 23 08:40:48 crc kubenswrapper[5047]: I0223 
08:40:48.890700 5047 generic.go:334] "Generic (PLEG): container finished" podID="a36c8032-a285-4492-8ac3-6b82b455b865" containerID="7204b4d0e16eefbedca9234be35959b70f726eadc685bee72d0427c7e67f3040" exitCode=0 Feb 23 08:40:48 crc kubenswrapper[5047]: I0223 08:40:48.890771 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"a36c8032-a285-4492-8ac3-6b82b455b865","Type":"ContainerDied","Data":"7204b4d0e16eefbedca9234be35959b70f726eadc685bee72d0427c7e67f3040"} Feb 23 08:40:48 crc kubenswrapper[5047]: I0223 08:40:48.890814 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"a36c8032-a285-4492-8ac3-6b82b455b865","Type":"ContainerStarted","Data":"939d212c39951f71f106ff8d32ebb2aeaecba663285c836a72dc7e5f84d57a35"} Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.262468 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.325128 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_a36c8032-a285-4492-8ac3-6b82b455b865/mariadb-client/0.log" Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.371945 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.371982 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.439041 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzpnr\" (UniqueName: \"kubernetes.io/projected/a36c8032-a285-4492-8ac3-6b82b455b865-kube-api-access-dzpnr\") pod \"a36c8032-a285-4492-8ac3-6b82b455b865\" (UID: \"a36c8032-a285-4492-8ac3-6b82b455b865\") " Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.444890 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/a36c8032-a285-4492-8ac3-6b82b455b865-kube-api-access-dzpnr" (OuterVolumeSpecName: "kube-api-access-dzpnr") pod "a36c8032-a285-4492-8ac3-6b82b455b865" (UID: "a36c8032-a285-4492-8ac3-6b82b455b865"). InnerVolumeSpecName "kube-api-access-dzpnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.541462 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzpnr\" (UniqueName: \"kubernetes.io/projected/a36c8032-a285-4492-8ac3-6b82b455b865-kube-api-access-dzpnr\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.913524 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="939d212c39951f71f106ff8d32ebb2aeaecba663285c836a72dc7e5f84d57a35" Feb 23 08:40:50 crc kubenswrapper[5047]: I0223 08:40:50.913674 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:40:52 crc kubenswrapper[5047]: I0223 08:40:52.350929 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36c8032-a285-4492-8ac3-6b82b455b865" path="/var/lib/kubelet/pods/a36c8032-a285-4492-8ac3-6b82b455b865/volumes" Feb 23 08:40:56 crc kubenswrapper[5047]: I0223 08:40:56.150200 5047 scope.go:117] "RemoveContainer" containerID="f93cfea23a9ba94ac4dc13889ee91f2d871df1cde600bd475bb6e57aa79f0b38" Feb 23 08:41:00 crc kubenswrapper[5047]: I0223 08:41:00.344692 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:41:00 crc kubenswrapper[5047]: E0223 08:41:00.346756 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:41:12 crc kubenswrapper[5047]: I0223 08:41:12.341107 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:41:12 crc kubenswrapper[5047]: E0223 08:41:12.342107 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:41:25 crc kubenswrapper[5047]: I0223 08:41:25.341477 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:41:25 crc kubenswrapper[5047]: E0223 08:41:25.342534 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.930211 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 08:41:26 crc kubenswrapper[5047]: E0223 08:41:26.934242 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36c8032-a285-4492-8ac3-6b82b455b865" containerName="mariadb-client" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.934272 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36c8032-a285-4492-8ac3-6b82b455b865" 
containerName="mariadb-client" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.934499 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36c8032-a285-4492-8ac3-6b82b455b865" containerName="mariadb-client" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.935428 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.940188 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-h2pm2" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.940501 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.940869 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.940967 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.940291 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.947170 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.948562 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.955182 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.968800 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.970522 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:26 crc kubenswrapper[5047]: I0223 08:41:26.992873 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.017006 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057337 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057427 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057458 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 
08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057492 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057539 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057573 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057610 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057645 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: 
I0223 08:41:27.057674 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057697 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-config\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057719 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057756 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057786 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvbwj\" (UniqueName: \"kubernetes.io/projected/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-kube-api-access-tvbwj\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 
08:41:27.057817 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-config\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057840 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxpj9\" (UniqueName: \"kubernetes.io/projected/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-kube-api-access-kxpj9\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.057871 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.159054 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.159384 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.159495 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.159613 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjm7\" (UniqueName: \"kubernetes.io/projected/146c85be-9d67-4281-873e-b27f5e90d957-kube-api-access-mgjm7\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.159738 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.159838 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.159967 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f7bf93e5-6c47-47d7-a283-272db430f832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160090 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160210 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160338 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-config\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160444 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160552 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160641 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvbwj\" (UniqueName: 
\"kubernetes.io/projected/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-kube-api-access-tvbwj\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160721 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-config\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160808 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpj9\" (UniqueName: \"kubernetes.io/projected/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-kube-api-access-kxpj9\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.160928 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/146c85be-9d67-4281-873e-b27f5e90d957-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161008 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161113 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " 
pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161211 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-config\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161290 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161380 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161447 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161507 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-config\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161536 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161636 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161698 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.161818 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.162076 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-config\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.162407 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " 
pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.163120 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.171717 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.171827 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.173537 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.174660 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.181746 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.185844 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.187662 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.187707 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxpj9\" (UniqueName: \"kubernetes.io/projected/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-kube-api-access-kxpj9\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.187713 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/75909b15159b80fc45be95e0d8e99ad5bd302afe81684a58c9340d4d9bd09c0d/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.187776 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.187805 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e195c44cc1445861eea1c51cdadd293c055e4700ba35bc76472e3d6f376d5b4c/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.215275 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvbwj\" (UniqueName: \"kubernetes.io/projected/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-kube-api-access-tvbwj\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.265821 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.268192 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjm7\" (UniqueName: \"kubernetes.io/projected/146c85be-9d67-4281-873e-b27f5e90d957-kube-api-access-mgjm7\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.268240 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" 
(UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.268273 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f7bf93e5-6c47-47d7-a283-272db430f832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.268542 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/146c85be-9d67-4281-873e-b27f5e90d957-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.268647 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-config\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.268724 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.268769 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.269246 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/146c85be-9d67-4281-873e-b27f5e90d957-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.270080 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.270100 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-config\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.270341 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.274766 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.275312 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.275343 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f7bf93e5-6c47-47d7-a283-272db430f832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/984c2954317e7722f0faaf86039f7b3b0341ee6c986b15b7d6fd7d6c84792479/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.277164 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.294210 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\") pod \"ovsdbserver-nb-2\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.299164 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjm7\" (UniqueName: \"kubernetes.io/projected/146c85be-9d67-4281-873e-b27f5e90d957-kube-api-access-mgjm7\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.302789 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\") pod \"ovsdbserver-nb-0\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.316480 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f7bf93e5-6c47-47d7-a283-272db430f832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832\") pod \"ovsdbserver-nb-1\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.583064 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.598368 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.608551 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.821964 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.824069 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.826727 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.827020 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-48jm6" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.827343 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.827486 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.835275 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.849282 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.849468 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.851257 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.851399 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.854876 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.861831 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979171 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979239 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74acc7db-4095-419a-9b09-afa04283a69f-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979272 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7bw\" (UniqueName: \"kubernetes.io/projected/385bab5b-7cad-4274-b672-a0614be3f41e-kube-api-access-kn7bw\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979302 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979334 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979387 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979409 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-config\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979433 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979462 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-config\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979494 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979524 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979567 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/74acc7db-4095-419a-9b09-afa04283a69f-kube-api-access-mkz4p\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.979591 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980018 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980115 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-config\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980172 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980253 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980287 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980332 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980360 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980416 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980449 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980474 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9n4c\" (UniqueName: \"kubernetes.io/projected/346c3f0d-e5fb-40f6-bd1e-65679466165f-kube-api-access-l9n4c\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:27 crc kubenswrapper[5047]: I0223 08:41:27.980495 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.031547 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 
08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.103966 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104018 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74acc7db-4095-419a-9b09-afa04283a69f-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104040 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7bw\" (UniqueName: \"kubernetes.io/projected/385bab5b-7cad-4274-b672-a0614be3f41e-kube-api-access-kn7bw\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104056 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104076 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104099 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104117 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-config\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104134 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104151 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-config\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104172 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104194 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104226 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/74acc7db-4095-419a-9b09-afa04283a69f-kube-api-access-mkz4p\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104244 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104268 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104286 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-config\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104302 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " 
pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104321 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104335 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104351 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104369 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104386 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104403 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104420 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9n4c\" (UniqueName: \"kubernetes.io/projected/346c3f0d-e5fb-40f6-bd1e-65679466165f-kube-api-access-l9n4c\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104436 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.104927 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.106429 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74acc7db-4095-419a-9b09-afa04283a69f-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.107665 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-scripts\") pod 
\"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.108493 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.109258 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.110275 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-config\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.111781 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.117001 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.117404 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-config\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.117640 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-config\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.120538 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.120742 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.121301 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.121862 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " 
pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.122310 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.127085 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.127161 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce62cbc5c40901c855d860bf713d925e7f9a6c3d378bf90274f50a9f73927776/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.130790 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7bw\" (UniqueName: \"kubernetes.io/projected/385bab5b-7cad-4274-b672-a0614be3f41e-kube-api-access-kn7bw\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.130977 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.131112 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a04dccc6888226b0d965915158ddb808a2d1985c80342815370347581745e551/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.131160 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.131211 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.131223 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.131257 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ab40536b527c97bcc6cf5bf829d584e142b699132416e067392966d4ea70696a/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.131832 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/74acc7db-4095-419a-9b09-afa04283a69f-kube-api-access-mkz4p\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.133568 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9n4c\" (UniqueName: \"kubernetes.io/projected/346c3f0d-e5fb-40f6-bd1e-65679466165f-kube-api-access-l9n4c\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.135539 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.165464 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\") pod 
\"ovsdbserver-sb-0\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.166049 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\") pod \"ovsdbserver-sb-2\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.170037 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\") pod \"ovsdbserver-sb-1\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.186945 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.199017 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.234420 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9205930b-2303-4b01-a3bf-cf4ef3ad0a49","Type":"ContainerStarted","Data":"b5e9c347395e22b45b3d92ad8de4b552ed83b5cfa51c306ad4ea5f6100207545"} Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.349227 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.466181 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.801573 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 08:41:28 crc kubenswrapper[5047]: W0223 08:41:28.820079 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod385bab5b_7cad_4274_b672_a0614be3f41e.slice/crio-8533a01011ace10ae16910553d260b19301102bbb29f5efb54691e2b02ca56af WatchSource:0}: Error finding container 8533a01011ace10ae16910553d260b19301102bbb29f5efb54691e2b02ca56af: Status 404 returned error can't find the container with id 8533a01011ace10ae16910553d260b19301102bbb29f5efb54691e2b02ca56af Feb 23 08:41:28 crc kubenswrapper[5047]: I0223 08:41:28.885163 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 08:41:28 crc kubenswrapper[5047]: W0223 08:41:28.886091 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74acc7db_4095_419a_9b09_afa04283a69f.slice/crio-ab37837a45462c9622cfc0a373f7b8dc97474dd28b811ca6b9653d8931daf9f8 WatchSource:0}: Error finding container ab37837a45462c9622cfc0a373f7b8dc97474dd28b811ca6b9653d8931daf9f8: Status 404 returned error can't find the container with id ab37837a45462c9622cfc0a373f7b8dc97474dd28b811ca6b9653d8931daf9f8 Feb 23 08:41:29 crc kubenswrapper[5047]: I0223 08:41:29.007292 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 08:41:29 crc kubenswrapper[5047]: W0223 08:41:29.029441 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod346c3f0d_e5fb_40f6_bd1e_65679466165f.slice/crio-8e0a04b317d902506666f366cdbba232698e745e2cfd26007172c793701ab106 WatchSource:0}: Error finding container 
8e0a04b317d902506666f366cdbba232698e745e2cfd26007172c793701ab106: Status 404 returned error can't find the container with id 8e0a04b317d902506666f366cdbba232698e745e2cfd26007172c793701ab106 Feb 23 08:41:29 crc kubenswrapper[5047]: I0223 08:41:29.260669 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8","Type":"ContainerStarted","Data":"c1fa523d70d012697da63410618631befcddece6cb31ecc5063ca4369f02f911"} Feb 23 08:41:29 crc kubenswrapper[5047]: I0223 08:41:29.265166 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"385bab5b-7cad-4274-b672-a0614be3f41e","Type":"ContainerStarted","Data":"8533a01011ace10ae16910553d260b19301102bbb29f5efb54691e2b02ca56af"} Feb 23 08:41:29 crc kubenswrapper[5047]: I0223 08:41:29.266789 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"346c3f0d-e5fb-40f6-bd1e-65679466165f","Type":"ContainerStarted","Data":"8e0a04b317d902506666f366cdbba232698e745e2cfd26007172c793701ab106"} Feb 23 08:41:29 crc kubenswrapper[5047]: I0223 08:41:29.269330 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"74acc7db-4095-419a-9b09-afa04283a69f","Type":"ContainerStarted","Data":"ab37837a45462c9622cfc0a373f7b8dc97474dd28b811ca6b9653d8931daf9f8"} Feb 23 08:41:29 crc kubenswrapper[5047]: I0223 08:41:29.453589 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 08:41:29 crc kubenswrapper[5047]: W0223 08:41:29.465853 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod146c85be_9d67_4281_873e_b27f5e90d957.slice/crio-7190a44dff669a9b3906efced5dbb427934cbde97e946429826653f924f96da8 WatchSource:0}: Error finding container 7190a44dff669a9b3906efced5dbb427934cbde97e946429826653f924f96da8: Status 404 returned error 
can't find the container with id 7190a44dff669a9b3906efced5dbb427934cbde97e946429826653f924f96da8 Feb 23 08:41:30 crc kubenswrapper[5047]: I0223 08:41:30.281069 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"146c85be-9d67-4281-873e-b27f5e90d957","Type":"ContainerStarted","Data":"7190a44dff669a9b3906efced5dbb427934cbde97e946429826653f924f96da8"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.385565 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"385bab5b-7cad-4274-b672-a0614be3f41e","Type":"ContainerStarted","Data":"f3246bebc9ebdff6972450f73eaa67d34d962a161929ce55ff3505b8bc91d983"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.386228 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"385bab5b-7cad-4274-b672-a0614be3f41e","Type":"ContainerStarted","Data":"d823703061a177710055372592a26735509301e028b1a82a26fc0eb8b4c976a3"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.389836 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9205930b-2303-4b01-a3bf-cf4ef3ad0a49","Type":"ContainerStarted","Data":"33c766011377115c89385b76818662881b608592e8f5fe9eacc9c3fff7aad9a0"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.390009 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9205930b-2303-4b01-a3bf-cf4ef3ad0a49","Type":"ContainerStarted","Data":"809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.392504 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"146c85be-9d67-4281-873e-b27f5e90d957","Type":"ContainerStarted","Data":"855e342977c6d200e9eb4077a8fada2e25afccca32a14e64e70912cbfbb54b29"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.392547 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"146c85be-9d67-4281-873e-b27f5e90d957","Type":"ContainerStarted","Data":"b95626b92928b330aeada129bc33bfc7f515fbbc0d2b15b4671ea5ae25433071"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.396819 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"346c3f0d-e5fb-40f6-bd1e-65679466165f","Type":"ContainerStarted","Data":"5b5cdce23bc6b1850489f770e11653c0e0f5c790fb261d4311d63e59f1d91444"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.396900 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"346c3f0d-e5fb-40f6-bd1e-65679466165f","Type":"ContainerStarted","Data":"3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.400411 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"74acc7db-4095-419a-9b09-afa04283a69f","Type":"ContainerStarted","Data":"c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.400484 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"74acc7db-4095-419a-9b09-afa04283a69f","Type":"ContainerStarted","Data":"1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.403652 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8","Type":"ContainerStarted","Data":"dc2a3c442046d0ca90c2e2cf7cce3b69605e0fac97b43c4fa57bbf160f1e4fe4"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.403714 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" 
event={"ID":"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8","Type":"ContainerStarted","Data":"e18a052509e18885714dbaa0ad71543da1d6cd717884fceb5aff08abd9fae5c9"} Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.418724 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.122780416 podStartE2EDuration="9.418691338s" podCreationTimestamp="2026-02-23 08:41:26 +0000 UTC" firstStartedPulling="2026-02-23 08:41:28.822146692 +0000 UTC m=+7011.073473846" lastFinishedPulling="2026-02-23 08:41:34.118057594 +0000 UTC m=+7016.369384768" observedRunningTime="2026-02-23 08:41:35.418137663 +0000 UTC m=+7017.669464837" watchObservedRunningTime="2026-02-23 08:41:35.418691338 +0000 UTC m=+7017.670018512" Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.480666 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.380369207 podStartE2EDuration="10.480647861s" podCreationTimestamp="2026-02-23 08:41:25 +0000 UTC" firstStartedPulling="2026-02-23 08:41:28.031405936 +0000 UTC m=+7010.282733070" lastFinishedPulling="2026-02-23 08:41:34.13168457 +0000 UTC m=+7016.383011724" observedRunningTime="2026-02-23 08:41:35.456837582 +0000 UTC m=+7017.708164736" watchObservedRunningTime="2026-02-23 08:41:35.480647861 +0000 UTC m=+7017.731974995" Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.500440 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.40937309 podStartE2EDuration="9.500408132s" podCreationTimestamp="2026-02-23 08:41:26 +0000 UTC" firstStartedPulling="2026-02-23 08:41:29.034616376 +0000 UTC m=+7011.285943510" lastFinishedPulling="2026-02-23 08:41:34.125651418 +0000 UTC m=+7016.376978552" observedRunningTime="2026-02-23 08:41:35.499591349 +0000 UTC m=+7017.750918573" watchObservedRunningTime="2026-02-23 08:41:35.500408132 +0000 UTC m=+7017.751735326" Feb 23 
08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.513064 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.749878826 podStartE2EDuration="10.51304342s" podCreationTimestamp="2026-02-23 08:41:25 +0000 UTC" firstStartedPulling="2026-02-23 08:41:28.355071534 +0000 UTC m=+7010.606398668" lastFinishedPulling="2026-02-23 08:41:34.118236118 +0000 UTC m=+7016.369563262" observedRunningTime="2026-02-23 08:41:35.477147637 +0000 UTC m=+7017.728474791" watchObservedRunningTime="2026-02-23 08:41:35.51304342 +0000 UTC m=+7017.764370564" Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.519305 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=5.857363494 podStartE2EDuration="10.519294218s" podCreationTimestamp="2026-02-23 08:41:25 +0000 UTC" firstStartedPulling="2026-02-23 08:41:29.468549634 +0000 UTC m=+7011.719876768" lastFinishedPulling="2026-02-23 08:41:34.130480338 +0000 UTC m=+7016.381807492" observedRunningTime="2026-02-23 08:41:35.517291995 +0000 UTC m=+7017.768619139" watchObservedRunningTime="2026-02-23 08:41:35.519294218 +0000 UTC m=+7017.770621362" Feb 23 08:41:35 crc kubenswrapper[5047]: I0223 08:41:35.572133 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.341848616 podStartE2EDuration="9.572112356s" podCreationTimestamp="2026-02-23 08:41:26 +0000 UTC" firstStartedPulling="2026-02-23 08:41:28.888142393 +0000 UTC m=+7011.139469527" lastFinishedPulling="2026-02-23 08:41:34.118406093 +0000 UTC m=+7016.369733267" observedRunningTime="2026-02-23 08:41:35.56406059 +0000 UTC m=+7017.815387734" watchObservedRunningTime="2026-02-23 08:41:35.572112356 +0000 UTC m=+7017.823439490" Feb 23 08:41:36 crc kubenswrapper[5047]: I0223 08:41:36.583695 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:36 crc kubenswrapper[5047]: I0223 08:41:36.599033 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:36 crc kubenswrapper[5047]: I0223 08:41:36.609338 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.213132 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.213210 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.282033 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.285325 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.421537 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.422251 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.466795 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.525364 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.583158 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.598610 5047 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:37 crc kubenswrapper[5047]: I0223 08:41:37.608748 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:38 crc kubenswrapper[5047]: I0223 08:41:38.346112 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:41:38 crc kubenswrapper[5047]: E0223 08:41:38.346675 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:41:38 crc kubenswrapper[5047]: I0223 08:41:38.430396 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.501454 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.511361 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.673048 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.733941 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.872320 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.872580 5047 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.872927 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.953525 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-858786496c-dlgrk"] Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.955074 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.957551 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.960002 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 23 08:41:39 crc kubenswrapper[5047]: I0223 08:41:39.966304 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858786496c-dlgrk"] Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.095454 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-ovsdbserver-sb\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.095497 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd78w\" (UniqueName: \"kubernetes.io/projected/1b002fb8-6b16-4ffb-88be-7f2e793e766d-kube-api-access-kd78w\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 
08:41:40.095552 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-config\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.095619 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-dns-svc\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.196934 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-ovsdbserver-sb\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.197386 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd78w\" (UniqueName: \"kubernetes.io/projected/1b002fb8-6b16-4ffb-88be-7f2e793e766d-kube-api-access-kd78w\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.197437 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-config\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.197517 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-dns-svc\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.202778 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-dns-svc\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.202865 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-ovsdbserver-sb\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.203039 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-config\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.231056 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd78w\" (UniqueName: \"kubernetes.io/projected/1b002fb8-6b16-4ffb-88be-7f2e793e766d-kube-api-access-kd78w\") pod \"dnsmasq-dns-858786496c-dlgrk\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.237231 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858786496c-dlgrk"] Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 
08:41:40.259983 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.273036 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdb4868d5-th7hh"] Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.274695 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.280189 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.309372 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb4868d5-th7hh"] Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.406933 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-config\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.407225 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-dns-svc\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.407415 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 
08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.407532 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlkz\" (UniqueName: \"kubernetes.io/projected/61bec6f6-3b99-42f1-bc30-c73dada51d9d-kube-api-access-6zlkz\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.407590 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.508732 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-dns-svc\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.509314 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.509366 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zlkz\" (UniqueName: \"kubernetes.io/projected/61bec6f6-3b99-42f1-bc30-c73dada51d9d-kube-api-access-6zlkz\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc 
kubenswrapper[5047]: I0223 08:41:40.509399 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.509493 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-config\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.510385 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-dns-svc\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.512266 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.512418 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.513310 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-config\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.533273 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zlkz\" (UniqueName: \"kubernetes.io/projected/61bec6f6-3b99-42f1-bc30-c73dada51d9d-kube-api-access-6zlkz\") pod \"dnsmasq-dns-fdb4868d5-th7hh\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.650281 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:40 crc kubenswrapper[5047]: I0223 08:41:40.801468 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858786496c-dlgrk"] Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.094421 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb4868d5-th7hh"] Feb 23 08:41:41 crc kubenswrapper[5047]: W0223 08:41:41.098064 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61bec6f6_3b99_42f1_bc30_c73dada51d9d.slice/crio-6d0c8470e1e28a8205943cb0a4f69e813f46c6b6104e1ef4ee07145a4dfcb069 WatchSource:0}: Error finding container 6d0c8470e1e28a8205943cb0a4f69e813f46c6b6104e1ef4ee07145a4dfcb069: Status 404 returned error can't find the container with id 6d0c8470e1e28a8205943cb0a4f69e813f46c6b6104e1ef4ee07145a4dfcb069 Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.473595 5047 generic.go:334] "Generic (PLEG): container finished" podID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerID="a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2" exitCode=0 Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.473772 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" event={"ID":"61bec6f6-3b99-42f1-bc30-c73dada51d9d","Type":"ContainerDied","Data":"a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2"} Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.474011 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" event={"ID":"61bec6f6-3b99-42f1-bc30-c73dada51d9d","Type":"ContainerStarted","Data":"6d0c8470e1e28a8205943cb0a4f69e813f46c6b6104e1ef4ee07145a4dfcb069"} Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.477333 5047 generic.go:334] "Generic (PLEG): container finished" podID="1b002fb8-6b16-4ffb-88be-7f2e793e766d" containerID="4d4acfa5e000f7506a51c997de04c086848bd72d4844479bfbd0fac1fbc9c10b" exitCode=0 Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.477433 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858786496c-dlgrk" event={"ID":"1b002fb8-6b16-4ffb-88be-7f2e793e766d","Type":"ContainerDied","Data":"4d4acfa5e000f7506a51c997de04c086848bd72d4844479bfbd0fac1fbc9c10b"} Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.477481 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858786496c-dlgrk" event={"ID":"1b002fb8-6b16-4ffb-88be-7f2e793e766d","Type":"ContainerStarted","Data":"31f333b3b11279dda6c7d674b282b3fef34bfe44174dec1fea7ff79dbade269d"} Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.812105 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.932983 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-dns-svc\") pod \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.933317 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-ovsdbserver-sb\") pod \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.933516 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-config\") pod \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.933556 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd78w\" (UniqueName: \"kubernetes.io/projected/1b002fb8-6b16-4ffb-88be-7f2e793e766d-kube-api-access-kd78w\") pod \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\" (UID: \"1b002fb8-6b16-4ffb-88be-7f2e793e766d\") " Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.937364 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b002fb8-6b16-4ffb-88be-7f2e793e766d-kube-api-access-kd78w" (OuterVolumeSpecName: "kube-api-access-kd78w") pod "1b002fb8-6b16-4ffb-88be-7f2e793e766d" (UID: "1b002fb8-6b16-4ffb-88be-7f2e793e766d"). InnerVolumeSpecName "kube-api-access-kd78w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.951801 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b002fb8-6b16-4ffb-88be-7f2e793e766d" (UID: "1b002fb8-6b16-4ffb-88be-7f2e793e766d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.955393 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-config" (OuterVolumeSpecName: "config") pod "1b002fb8-6b16-4ffb-88be-7f2e793e766d" (UID: "1b002fb8-6b16-4ffb-88be-7f2e793e766d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:41:41 crc kubenswrapper[5047]: I0223 08:41:41.959596 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b002fb8-6b16-4ffb-88be-7f2e793e766d" (UID: "1b002fb8-6b16-4ffb-88be-7f2e793e766d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.036640 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.036709 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd78w\" (UniqueName: \"kubernetes.io/projected/1b002fb8-6b16-4ffb-88be-7f2e793e766d-kube-api-access-kd78w\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.036732 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.036750 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b002fb8-6b16-4ffb-88be-7f2e793e766d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.489721 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" event={"ID":"61bec6f6-3b99-42f1-bc30-c73dada51d9d","Type":"ContainerStarted","Data":"5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb"} Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.489995 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.492384 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-858786496c-dlgrk" event={"ID":"1b002fb8-6b16-4ffb-88be-7f2e793e766d","Type":"ContainerDied","Data":"31f333b3b11279dda6c7d674b282b3fef34bfe44174dec1fea7ff79dbade269d"} Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.492433 5047 scope.go:117] 
"RemoveContainer" containerID="4d4acfa5e000f7506a51c997de04c086848bd72d4844479bfbd0fac1fbc9c10b" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.492564 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858786496c-dlgrk" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.515174 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" podStartSLOduration=2.515134583 podStartE2EDuration="2.515134583s" podCreationTimestamp="2026-02-23 08:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:41:42.512529493 +0000 UTC m=+7024.763856657" watchObservedRunningTime="2026-02-23 08:41:42.515134583 +0000 UTC m=+7024.766461717" Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.600541 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858786496c-dlgrk"] Feb 23 08:41:42 crc kubenswrapper[5047]: I0223 08:41:42.606598 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-858786496c-dlgrk"] Feb 23 08:41:43 crc kubenswrapper[5047]: I0223 08:41:43.266266 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 23 08:41:44 crc kubenswrapper[5047]: I0223 08:41:44.358031 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b002fb8-6b16-4ffb-88be-7f2e793e766d" path="/var/lib/kubelet/pods/1b002fb8-6b16-4ffb-88be-7f2e793e766d/volumes" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.525541 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 23 08:41:45 crc kubenswrapper[5047]: E0223 08:41:45.526416 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b002fb8-6b16-4ffb-88be-7f2e793e766d" containerName="init" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 
08:41:45.526436 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b002fb8-6b16-4ffb-88be-7f2e793e766d" containerName="init" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.526675 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b002fb8-6b16-4ffb-88be-7f2e793e766d" containerName="init" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.527372 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.530430 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.540803 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.707340 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-05b84435-efb2-4b49-8b88-57874eed5701\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.707411 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/33e156a6-7aa5-4769-8219-118aecb3b161-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.707509 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmdbw\" (UniqueName: \"kubernetes.io/projected/33e156a6-7aa5-4769-8219-118aecb3b161-kube-api-access-qmdbw\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 
08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.809392 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-05b84435-efb2-4b49-8b88-57874eed5701\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.809470 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/33e156a6-7aa5-4769-8219-118aecb3b161-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.809505 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmdbw\" (UniqueName: \"kubernetes.io/projected/33e156a6-7aa5-4769-8219-118aecb3b161-kube-api-access-qmdbw\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.816423 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.816473 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-05b84435-efb2-4b49-8b88-57874eed5701\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c49a2d0fd0de365112872c65550329010f750c91b3f0a18e4cba49505b015078/globalmount\"" pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.818043 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/33e156a6-7aa5-4769-8219-118aecb3b161-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.842760 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmdbw\" (UniqueName: \"kubernetes.io/projected/33e156a6-7aa5-4769-8219-118aecb3b161-kube-api-access-qmdbw\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:45 crc kubenswrapper[5047]: I0223 08:41:45.880311 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-05b84435-efb2-4b49-8b88-57874eed5701\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701\") pod \"ovn-copy-data\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " pod="openstack/ovn-copy-data" Feb 23 08:41:46 crc kubenswrapper[5047]: I0223 08:41:46.161123 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 08:41:46 crc kubenswrapper[5047]: I0223 08:41:46.720302 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 08:41:46 crc kubenswrapper[5047]: W0223 08:41:46.726129 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e156a6_7aa5_4769_8219_118aecb3b161.slice/crio-3ed959145df59d193fbf97cb9f6f6526e3274b1e4aa71dc030b245e4512ecd17 WatchSource:0}: Error finding container 3ed959145df59d193fbf97cb9f6f6526e3274b1e4aa71dc030b245e4512ecd17: Status 404 returned error can't find the container with id 3ed959145df59d193fbf97cb9f6f6526e3274b1e4aa71dc030b245e4512ecd17 Feb 23 08:41:47 crc kubenswrapper[5047]: I0223 08:41:47.577201 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"33e156a6-7aa5-4769-8219-118aecb3b161","Type":"ContainerStarted","Data":"1c77c1f0f1e6b25a138a342acba27c33a4de799901c4d45d6ddcd5cd88c94c61"} Feb 23 08:41:47 crc kubenswrapper[5047]: I0223 08:41:47.577837 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"33e156a6-7aa5-4769-8219-118aecb3b161","Type":"ContainerStarted","Data":"3ed959145df59d193fbf97cb9f6f6526e3274b1e4aa71dc030b245e4512ecd17"} Feb 23 08:41:47 crc kubenswrapper[5047]: I0223 08:41:47.614433 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.417339545 podStartE2EDuration="3.614409056s" podCreationTimestamp="2026-02-23 08:41:44 +0000 UTC" firstStartedPulling="2026-02-23 08:41:46.727743604 +0000 UTC m=+7028.979070738" lastFinishedPulling="2026-02-23 08:41:46.924813115 +0000 UTC m=+7029.176140249" observedRunningTime="2026-02-23 08:41:47.596704281 +0000 UTC m=+7029.848031415" watchObservedRunningTime="2026-02-23 08:41:47.614409056 +0000 UTC m=+7029.865736200" Feb 23 08:41:49 crc kubenswrapper[5047]: 
I0223 08:41:49.341223 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:41:49 crc kubenswrapper[5047]: E0223 08:41:49.341951 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:41:50 crc kubenswrapper[5047]: I0223 08:41:50.653124 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:41:50 crc kubenswrapper[5047]: I0223 08:41:50.736216 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-hph48"] Feb 23 08:41:50 crc kubenswrapper[5047]: I0223 08:41:50.736602 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79496f79cc-hph48" podUID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerName="dnsmasq-dns" containerID="cri-o://a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc" gracePeriod=10 Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.336180 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-hph48" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.425607 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v25c6\" (UniqueName: \"kubernetes.io/projected/79488449-9115-4c14-a738-6e8a8fc93dd6-kube-api-access-v25c6\") pod \"79488449-9115-4c14-a738-6e8a8fc93dd6\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.425973 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-config\") pod \"79488449-9115-4c14-a738-6e8a8fc93dd6\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.426086 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-dns-svc\") pod \"79488449-9115-4c14-a738-6e8a8fc93dd6\" (UID: \"79488449-9115-4c14-a738-6e8a8fc93dd6\") " Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.439090 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79488449-9115-4c14-a738-6e8a8fc93dd6-kube-api-access-v25c6" (OuterVolumeSpecName: "kube-api-access-v25c6") pod "79488449-9115-4c14-a738-6e8a8fc93dd6" (UID: "79488449-9115-4c14-a738-6e8a8fc93dd6"). InnerVolumeSpecName "kube-api-access-v25c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.484689 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79488449-9115-4c14-a738-6e8a8fc93dd6" (UID: "79488449-9115-4c14-a738-6e8a8fc93dd6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.494128 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-config" (OuterVolumeSpecName: "config") pod "79488449-9115-4c14-a738-6e8a8fc93dd6" (UID: "79488449-9115-4c14-a738-6e8a8fc93dd6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.529042 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v25c6\" (UniqueName: \"kubernetes.io/projected/79488449-9115-4c14-a738-6e8a8fc93dd6-kube-api-access-v25c6\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.529101 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.529127 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79488449-9115-4c14-a738-6e8a8fc93dd6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.621380 5047 generic.go:334] "Generic (PLEG): container finished" podID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerID="a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc" exitCode=0 Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.621439 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-hph48" event={"ID":"79488449-9115-4c14-a738-6e8a8fc93dd6","Type":"ContainerDied","Data":"a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc"} Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.621479 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-hph48" 
event={"ID":"79488449-9115-4c14-a738-6e8a8fc93dd6","Type":"ContainerDied","Data":"28593a9e2c4cb60ef09566c289bda89d6e48e1f84b67d703a7ba81c568371349"} Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.621508 5047 scope.go:117] "RemoveContainer" containerID="a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.621691 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-hph48" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.675424 5047 scope.go:117] "RemoveContainer" containerID="c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.685697 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-hph48"] Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.696671 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-hph48"] Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.703659 5047 scope.go:117] "RemoveContainer" containerID="a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc" Feb 23 08:41:51 crc kubenswrapper[5047]: E0223 08:41:51.704286 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc\": container with ID starting with a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc not found: ID does not exist" containerID="a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.704331 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc"} err="failed to get container status 
\"a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc\": rpc error: code = NotFound desc = could not find container \"a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc\": container with ID starting with a1ba1d8ec540e8dc98dd9804ecbed5fdb8255cd10c575f40a12e1a4b9f8237dc not found: ID does not exist" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.704379 5047 scope.go:117] "RemoveContainer" containerID="c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a" Feb 23 08:41:51 crc kubenswrapper[5047]: E0223 08:41:51.705023 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a\": container with ID starting with c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a not found: ID does not exist" containerID="c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a" Feb 23 08:41:51 crc kubenswrapper[5047]: I0223 08:41:51.705071 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a"} err="failed to get container status \"c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a\": rpc error: code = NotFound desc = could not find container \"c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a\": container with ID starting with c86b73b5bd1561cf31ddcefc14de7cd23ddbeef7ad7a77147064cc184b5f144a not found: ID does not exist" Feb 23 08:41:52 crc kubenswrapper[5047]: I0223 08:41:52.354448 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79488449-9115-4c14-a738-6e8a8fc93dd6" path="/var/lib/kubelet/pods/79488449-9115-4c14-a738-6e8a8fc93dd6/volumes" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.611757 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 23 08:41:55 crc kubenswrapper[5047]: 
E0223 08:41:55.612592 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerName="init" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.612612 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerName="init" Feb 23 08:41:55 crc kubenswrapper[5047]: E0223 08:41:55.612634 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerName="dnsmasq-dns" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.612643 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerName="dnsmasq-dns" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.612855 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="79488449-9115-4c14-a738-6e8a8fc93dd6" containerName="dnsmasq-dns" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.623457 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.639774 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.641820 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.642406 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.642703 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-s7kst" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.643617 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.746945 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-config\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.747017 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c59427cb-019c-4f83-af18-75900909e70f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.747041 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-scripts\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc 
kubenswrapper[5047]: I0223 08:41:55.747097 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.747126 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/c59427cb-019c-4f83-af18-75900909e70f-kube-api-access-j5k5d\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.747155 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.747214 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.849294 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.849614 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.849811 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-config\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.850023 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c59427cb-019c-4f83-af18-75900909e70f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.850203 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-scripts\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.850339 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.850474 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/c59427cb-019c-4f83-af18-75900909e70f-kube-api-access-j5k5d\") pod \"ovn-northd-0\" (UID: 
\"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.850532 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c59427cb-019c-4f83-af18-75900909e70f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.851217 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-scripts\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.851288 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-config\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.857717 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.861848 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.861873 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.872633 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/c59427cb-019c-4f83-af18-75900909e70f-kube-api-access-j5k5d\") pod \"ovn-northd-0\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " pod="openstack/ovn-northd-0" Feb 23 08:41:55 crc kubenswrapper[5047]: I0223 08:41:55.950278 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 08:41:56 crc kubenswrapper[5047]: I0223 08:41:56.383149 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 08:41:56 crc kubenswrapper[5047]: W0223 08:41:56.390545 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc59427cb_019c_4f83_af18_75900909e70f.slice/crio-9a19259a9f8f0b14e6169dbcfedda2d068f9b140d9214894ccfa3ec424cc30f3 WatchSource:0}: Error finding container 9a19259a9f8f0b14e6169dbcfedda2d068f9b140d9214894ccfa3ec424cc30f3: Status 404 returned error can't find the container with id 9a19259a9f8f0b14e6169dbcfedda2d068f9b140d9214894ccfa3ec424cc30f3 Feb 23 08:41:56 crc kubenswrapper[5047]: I0223 08:41:56.671636 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c59427cb-019c-4f83-af18-75900909e70f","Type":"ContainerStarted","Data":"9a19259a9f8f0b14e6169dbcfedda2d068f9b140d9214894ccfa3ec424cc30f3"} Feb 23 08:41:57 crc kubenswrapper[5047]: I0223 08:41:57.697679 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c59427cb-019c-4f83-af18-75900909e70f","Type":"ContainerStarted","Data":"300cc5bda7124e79cc3bdc763a253160aad67d2c3e2c6d44cbdf59edc78788c3"} 
Feb 23 08:41:57 crc kubenswrapper[5047]: I0223 08:41:57.698031 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c59427cb-019c-4f83-af18-75900909e70f","Type":"ContainerStarted","Data":"5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b"} Feb 23 08:41:57 crc kubenswrapper[5047]: I0223 08:41:57.698209 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 23 08:41:57 crc kubenswrapper[5047]: I0223 08:41:57.746083 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.133751433 podStartE2EDuration="2.746057879s" podCreationTimestamp="2026-02-23 08:41:55 +0000 UTC" firstStartedPulling="2026-02-23 08:41:56.393521132 +0000 UTC m=+7038.644848276" lastFinishedPulling="2026-02-23 08:41:57.005827578 +0000 UTC m=+7039.257154722" observedRunningTime="2026-02-23 08:41:57.729152415 +0000 UTC m=+7039.980479559" watchObservedRunningTime="2026-02-23 08:41:57.746057879 +0000 UTC m=+7039.997385043" Feb 23 08:42:00 crc kubenswrapper[5047]: I0223 08:42:00.341455 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:42:00 crc kubenswrapper[5047]: E0223 08:42:00.342282 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.657295 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-sbvrk"] Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.658865 5047 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.666224 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ce6c-account-create-update-hlzml"] Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.667561 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.669386 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.673523 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sbvrk"] Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.733441 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ce6c-account-create-update-hlzml"] Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.759260 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhnn\" (UniqueName: \"kubernetes.io/projected/9181e45e-92b4-4708-b4f6-6a6d2788f58c-kube-api-access-xdhnn\") pod \"keystone-db-create-sbvrk\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") " pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.759359 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvq9j\" (UniqueName: \"kubernetes.io/projected/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-kube-api-access-zvq9j\") pod \"keystone-ce6c-account-create-update-hlzml\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") " pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.759388 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9181e45e-92b4-4708-b4f6-6a6d2788f58c-operator-scripts\") pod \"keystone-db-create-sbvrk\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") " pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.759411 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-operator-scripts\") pod \"keystone-ce6c-account-create-update-hlzml\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") " pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.860890 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvq9j\" (UniqueName: \"kubernetes.io/projected/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-kube-api-access-zvq9j\") pod \"keystone-ce6c-account-create-update-hlzml\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") " pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.860988 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9181e45e-92b4-4708-b4f6-6a6d2788f58c-operator-scripts\") pod \"keystone-db-create-sbvrk\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") " pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.861052 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-operator-scripts\") pod \"keystone-ce6c-account-create-update-hlzml\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") " pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.861181 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xdhnn\" (UniqueName: \"kubernetes.io/projected/9181e45e-92b4-4708-b4f6-6a6d2788f58c-kube-api-access-xdhnn\") pod \"keystone-db-create-sbvrk\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") " pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.861820 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9181e45e-92b4-4708-b4f6-6a6d2788f58c-operator-scripts\") pod \"keystone-db-create-sbvrk\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") " pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.862359 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-operator-scripts\") pod \"keystone-ce6c-account-create-update-hlzml\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") " pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.883413 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvq9j\" (UniqueName: \"kubernetes.io/projected/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-kube-api-access-zvq9j\") pod \"keystone-ce6c-account-create-update-hlzml\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") " pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:03 crc kubenswrapper[5047]: I0223 08:42:03.887767 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhnn\" (UniqueName: \"kubernetes.io/projected/9181e45e-92b4-4708-b4f6-6a6d2788f58c-kube-api-access-xdhnn\") pod \"keystone-db-create-sbvrk\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") " pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.022033 5047 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-db-create-sbvrk" Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.033499 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.417926 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sbvrk"] Feb 23 08:42:04 crc kubenswrapper[5047]: W0223 08:42:04.623637 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d9c6bb4_ecce_45e6_abe5_a9312dca8ff9.slice/crio-69821c57acd26c734b8218f2f3686d11994bff6f853e93b3c668ac13770599c7 WatchSource:0}: Error finding container 69821c57acd26c734b8218f2f3686d11994bff6f853e93b3c668ac13770599c7: Status 404 returned error can't find the container with id 69821c57acd26c734b8218f2f3686d11994bff6f853e93b3c668ac13770599c7 Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.626060 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ce6c-account-create-update-hlzml"] Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.761550 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sbvrk" event={"ID":"9181e45e-92b4-4708-b4f6-6a6d2788f58c","Type":"ContainerStarted","Data":"f6f1b79ce137127b65dabf259d8de39db00053284a39ac300bf2af86e9705136"} Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.761617 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sbvrk" event={"ID":"9181e45e-92b4-4708-b4f6-6a6d2788f58c","Type":"ContainerStarted","Data":"53d15c0f8d30fcf08881a3b0ace099b74782e1d29923aa7911e09b7d6979db48"} Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.764019 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce6c-account-create-update-hlzml" 
event={"ID":"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9","Type":"ContainerStarted","Data":"5537e05cd0355fd5f0df0c83cefaaf1ce918ee0b53b90094a0be5217922e10bf"} Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.764068 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce6c-account-create-update-hlzml" event={"ID":"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9","Type":"ContainerStarted","Data":"69821c57acd26c734b8218f2f3686d11994bff6f853e93b3c668ac13770599c7"} Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.790773 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-sbvrk" podStartSLOduration=1.790728353 podStartE2EDuration="1.790728353s" podCreationTimestamp="2026-02-23 08:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:42:04.783771487 +0000 UTC m=+7047.035098641" watchObservedRunningTime="2026-02-23 08:42:04.790728353 +0000 UTC m=+7047.042055487" Feb 23 08:42:04 crc kubenswrapper[5047]: I0223 08:42:04.805755 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ce6c-account-create-update-hlzml" podStartSLOduration=1.805726007 podStartE2EDuration="1.805726007s" podCreationTimestamp="2026-02-23 08:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:42:04.800438705 +0000 UTC m=+7047.051765879" watchObservedRunningTime="2026-02-23 08:42:04.805726007 +0000 UTC m=+7047.057053161" Feb 23 08:42:05 crc kubenswrapper[5047]: I0223 08:42:05.776636 5047 generic.go:334] "Generic (PLEG): container finished" podID="9181e45e-92b4-4708-b4f6-6a6d2788f58c" containerID="f6f1b79ce137127b65dabf259d8de39db00053284a39ac300bf2af86e9705136" exitCode=0 Feb 23 08:42:05 crc kubenswrapper[5047]: I0223 08:42:05.776722 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-db-create-sbvrk" event={"ID":"9181e45e-92b4-4708-b4f6-6a6d2788f58c","Type":"ContainerDied","Data":"f6f1b79ce137127b65dabf259d8de39db00053284a39ac300bf2af86e9705136"} Feb 23 08:42:05 crc kubenswrapper[5047]: I0223 08:42:05.779453 5047 generic.go:334] "Generic (PLEG): container finished" podID="8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9" containerID="5537e05cd0355fd5f0df0c83cefaaf1ce918ee0b53b90094a0be5217922e10bf" exitCode=0 Feb 23 08:42:05 crc kubenswrapper[5047]: I0223 08:42:05.779499 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce6c-account-create-update-hlzml" event={"ID":"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9","Type":"ContainerDied","Data":"5537e05cd0355fd5f0df0c83cefaaf1ce918ee0b53b90094a0be5217922e10bf"} Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.332077 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-hlzml" Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.339184 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-sbvrk"
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.481426 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvq9j\" (UniqueName: \"kubernetes.io/projected/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-kube-api-access-zvq9j\") pod \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") "
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.481592 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-operator-scripts\") pod \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\" (UID: \"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9\") "
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.481738 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9181e45e-92b4-4708-b4f6-6a6d2788f58c-operator-scripts\") pod \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") "
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.482036 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhnn\" (UniqueName: \"kubernetes.io/projected/9181e45e-92b4-4708-b4f6-6a6d2788f58c-kube-api-access-xdhnn\") pod \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\" (UID: \"9181e45e-92b4-4708-b4f6-6a6d2788f58c\") "
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.482631 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9181e45e-92b4-4708-b4f6-6a6d2788f58c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9181e45e-92b4-4708-b4f6-6a6d2788f58c" (UID: "9181e45e-92b4-4708-b4f6-6a6d2788f58c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.482887 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9181e45e-92b4-4708-b4f6-6a6d2788f58c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.486175 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9" (UID: "8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.492464 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-kube-api-access-zvq9j" (OuterVolumeSpecName: "kube-api-access-zvq9j") pod "8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9" (UID: "8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9"). InnerVolumeSpecName "kube-api-access-zvq9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.505692 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9181e45e-92b4-4708-b4f6-6a6d2788f58c-kube-api-access-xdhnn" (OuterVolumeSpecName: "kube-api-access-xdhnn") pod "9181e45e-92b4-4708-b4f6-6a6d2788f58c" (UID: "9181e45e-92b4-4708-b4f6-6a6d2788f58c"). InnerVolumeSpecName "kube-api-access-xdhnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.585167 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvq9j\" (UniqueName: \"kubernetes.io/projected/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-kube-api-access-zvq9j\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.585237 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.585254 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhnn\" (UniqueName: \"kubernetes.io/projected/9181e45e-92b4-4708-b4f6-6a6d2788f58c-kube-api-access-xdhnn\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.798598 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ce6c-account-create-update-hlzml" event={"ID":"8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9","Type":"ContainerDied","Data":"69821c57acd26c734b8218f2f3686d11994bff6f853e93b3c668ac13770599c7"}
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.798660 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69821c57acd26c734b8218f2f3686d11994bff6f853e93b3c668ac13770599c7"
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.798693 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-hlzml"
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.800220 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sbvrk" event={"ID":"9181e45e-92b4-4708-b4f6-6a6d2788f58c","Type":"ContainerDied","Data":"53d15c0f8d30fcf08881a3b0ace099b74782e1d29923aa7911e09b7d6979db48"}
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.800247 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53d15c0f8d30fcf08881a3b0ace099b74782e1d29923aa7911e09b7d6979db48"
Feb 23 08:42:07 crc kubenswrapper[5047]: I0223 08:42:07.800300 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sbvrk"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.142082 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xstzb"]
Feb 23 08:42:09 crc kubenswrapper[5047]: E0223 08:42:09.143001 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9" containerName="mariadb-account-create-update"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.143019 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9" containerName="mariadb-account-create-update"
Feb 23 08:42:09 crc kubenswrapper[5047]: E0223 08:42:09.143039 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9181e45e-92b4-4708-b4f6-6a6d2788f58c" containerName="mariadb-database-create"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.143045 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9181e45e-92b4-4708-b4f6-6a6d2788f58c" containerName="mariadb-database-create"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.143228 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9181e45e-92b4-4708-b4f6-6a6d2788f58c" containerName="mariadb-database-create"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.143247 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9" containerName="mariadb-account-create-update"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.143984 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.146373 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.150106 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w65rc"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.150343 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.154766 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.171029 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xstzb"]
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.317603 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-config-data\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.317715 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-combined-ca-bundle\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.317778 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvgmj\" (UniqueName: \"kubernetes.io/projected/423c715b-9e17-481e-8cf4-2ff875c1c45b-kube-api-access-lvgmj\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.420237 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-config-data\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.420336 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-combined-ca-bundle\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.420433 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvgmj\" (UniqueName: \"kubernetes.io/projected/423c715b-9e17-481e-8cf4-2ff875c1c45b-kube-api-access-lvgmj\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.427543 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-config-data\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.427801 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-combined-ca-bundle\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.444368 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvgmj\" (UniqueName: \"kubernetes.io/projected/423c715b-9e17-481e-8cf4-2ff875c1c45b-kube-api-access-lvgmj\") pod \"keystone-db-sync-xstzb\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") " pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.491858 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:09 crc kubenswrapper[5047]: I0223 08:42:09.969719 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xstzb"]
Feb 23 08:42:10 crc kubenswrapper[5047]: I0223 08:42:10.827289 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xstzb" event={"ID":"423c715b-9e17-481e-8cf4-2ff875c1c45b","Type":"ContainerStarted","Data":"1778bcb455109828ad32fde50479e06aa89f1b36954d5812b3179dfd565392de"}
Feb 23 08:42:13 crc kubenswrapper[5047]: I0223 08:42:13.343354 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357"
Feb 23 08:42:13 crc kubenswrapper[5047]: E0223 08:42:13.343590 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:42:15 crc kubenswrapper[5047]: I0223 08:42:15.875224 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xstzb" event={"ID":"423c715b-9e17-481e-8cf4-2ff875c1c45b","Type":"ContainerStarted","Data":"b8db065030ca1be9880114e8eba32cadb35bd96e02839eebdeace90dc4243c79"}
Feb 23 08:42:16 crc kubenswrapper[5047]: I0223 08:42:16.034991 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 23 08:42:16 crc kubenswrapper[5047]: I0223 08:42:16.072624 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xstzb" podStartSLOduration=1.600202392 podStartE2EDuration="7.072589041s" podCreationTimestamp="2026-02-23 08:42:09 +0000 UTC" firstStartedPulling="2026-02-23 08:42:09.982669005 +0000 UTC m=+7052.233996139" lastFinishedPulling="2026-02-23 08:42:15.455055644 +0000 UTC m=+7057.706382788" observedRunningTime="2026-02-23 08:42:15.904480869 +0000 UTC m=+7058.155808013" watchObservedRunningTime="2026-02-23 08:42:16.072589041 +0000 UTC m=+7058.323916185"
Feb 23 08:42:17 crc kubenswrapper[5047]: I0223 08:42:17.897230 5047 generic.go:334] "Generic (PLEG): container finished" podID="423c715b-9e17-481e-8cf4-2ff875c1c45b" containerID="b8db065030ca1be9880114e8eba32cadb35bd96e02839eebdeace90dc4243c79" exitCode=0
Feb 23 08:42:17 crc kubenswrapper[5047]: I0223 08:42:17.897278 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xstzb" event={"ID":"423c715b-9e17-481e-8cf4-2ff875c1c45b","Type":"ContainerDied","Data":"b8db065030ca1be9880114e8eba32cadb35bd96e02839eebdeace90dc4243c79"}
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.299342 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.439276 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-config-data\") pod \"423c715b-9e17-481e-8cf4-2ff875c1c45b\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") "
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.439512 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-combined-ca-bundle\") pod \"423c715b-9e17-481e-8cf4-2ff875c1c45b\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") "
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.439626 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvgmj\" (UniqueName: \"kubernetes.io/projected/423c715b-9e17-481e-8cf4-2ff875c1c45b-kube-api-access-lvgmj\") pod \"423c715b-9e17-481e-8cf4-2ff875c1c45b\" (UID: \"423c715b-9e17-481e-8cf4-2ff875c1c45b\") "
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.452610 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423c715b-9e17-481e-8cf4-2ff875c1c45b-kube-api-access-lvgmj" (OuterVolumeSpecName: "kube-api-access-lvgmj") pod "423c715b-9e17-481e-8cf4-2ff875c1c45b" (UID: "423c715b-9e17-481e-8cf4-2ff875c1c45b"). InnerVolumeSpecName "kube-api-access-lvgmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.461951 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "423c715b-9e17-481e-8cf4-2ff875c1c45b" (UID: "423c715b-9e17-481e-8cf4-2ff875c1c45b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.514721 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-config-data" (OuterVolumeSpecName: "config-data") pod "423c715b-9e17-481e-8cf4-2ff875c1c45b" (UID: "423c715b-9e17-481e-8cf4-2ff875c1c45b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.542365 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.542420 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvgmj\" (UniqueName: \"kubernetes.io/projected/423c715b-9e17-481e-8cf4-2ff875c1c45b-kube-api-access-lvgmj\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.542441 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423c715b-9e17-481e-8cf4-2ff875c1c45b-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.921576 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xstzb" event={"ID":"423c715b-9e17-481e-8cf4-2ff875c1c45b","Type":"ContainerDied","Data":"1778bcb455109828ad32fde50479e06aa89f1b36954d5812b3179dfd565392de"}
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.921642 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1778bcb455109828ad32fde50479e06aa89f1b36954d5812b3179dfd565392de"
Feb 23 08:42:19 crc kubenswrapper[5047]: I0223 08:42:19.921650 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xstzb"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.564056 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb8ffc8f7-vngbx"]
Feb 23 08:42:20 crc kubenswrapper[5047]: E0223 08:42:20.564387 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423c715b-9e17-481e-8cf4-2ff875c1c45b" containerName="keystone-db-sync"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.564398 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="423c715b-9e17-481e-8cf4-2ff875c1c45b" containerName="keystone-db-sync"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.564586 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="423c715b-9e17-481e-8cf4-2ff875c1c45b" containerName="keystone-db-sync"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.565373 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.591043 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb8ffc8f7-vngbx"]
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.617437 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-frn7b"]
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.618853 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.623754 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w65rc"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.623972 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.623992 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.624153 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.627432 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.630069 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-frn7b"]
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.665343 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crr9\" (UniqueName: \"kubernetes.io/projected/ea0fb049-f486-4799-8cc4-31c66c897340-kube-api-access-9crr9\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.665399 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-dns-svc\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.665676 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-nb\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.665779 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-config\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.665838 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-sb\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767296 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-fernet-keys\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767373 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9crr9\" (UniqueName: \"kubernetes.io/projected/ea0fb049-f486-4799-8cc4-31c66c897340-kube-api-access-9crr9\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767394 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-combined-ca-bundle\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767499 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-dns-svc\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767630 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfrmn\" (UniqueName: \"kubernetes.io/projected/120f91c7-b542-44c3-8399-bbb9f2638e9d-kube-api-access-nfrmn\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767722 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-nb\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767801 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-config\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767864 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-sb\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767965 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-scripts\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.767996 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-config-data\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.768118 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-credential-keys\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.768553 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-dns-svc\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.768592 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-nb\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.768972 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-config\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.769096 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-sb\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.791276 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crr9\" (UniqueName: \"kubernetes.io/projected/ea0fb049-f486-4799-8cc4-31c66c897340-kube-api-access-9crr9\") pod \"dnsmasq-dns-cb8ffc8f7-vngbx\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.870276 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfrmn\" (UniqueName: \"kubernetes.io/projected/120f91c7-b542-44c3-8399-bbb9f2638e9d-kube-api-access-nfrmn\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.870425 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-scripts\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.870450 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-config-data\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.870505 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-credential-keys\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.870566 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-fernet-keys\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.870615 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-combined-ca-bundle\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.873774 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-scripts\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.874217 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-combined-ca-bundle\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.875344 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-credential-keys\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.875669 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-config-data\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.883965 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-fernet-keys\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.890362 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfrmn\" (UniqueName: \"kubernetes.io/projected/120f91c7-b542-44c3-8399-bbb9f2638e9d-kube-api-access-nfrmn\") pod \"keystone-bootstrap-frn7b\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.896615 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:20 crc kubenswrapper[5047]: I0223 08:42:20.953870 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-frn7b"
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.374693 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb8ffc8f7-vngbx"]
Feb 23 08:42:21 crc kubenswrapper[5047]: W0223 08:42:21.442983 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod120f91c7_b542_44c3_8399_bbb9f2638e9d.slice/crio-3e53f0fe35cdc660da6791cd04e2fa8ccce60280f0b740e0552eb44df1b6f347 WatchSource:0}: Error finding container 3e53f0fe35cdc660da6791cd04e2fa8ccce60280f0b740e0552eb44df1b6f347: Status 404 returned error can't find the container with id 3e53f0fe35cdc660da6791cd04e2fa8ccce60280f0b740e0552eb44df1b6f347
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.443274 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-frn7b"]
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.947106 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frn7b" event={"ID":"120f91c7-b542-44c3-8399-bbb9f2638e9d","Type":"ContainerStarted","Data":"b1646bdaab221b27450b2e82c3c3b57b4d98452c88093359f43031a9f8604659"}
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.947231 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frn7b" event={"ID":"120f91c7-b542-44c3-8399-bbb9f2638e9d","Type":"ContainerStarted","Data":"3e53f0fe35cdc660da6791cd04e2fa8ccce60280f0b740e0552eb44df1b6f347"}
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.950178 5047 generic.go:334] "Generic (PLEG): container finished" podID="ea0fb049-f486-4799-8cc4-31c66c897340" containerID="1dac1d9d7ceac931c476e1fa89a68153eaabc3efa8351493368913e15b132aa7" exitCode=0
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.950244 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" event={"ID":"ea0fb049-f486-4799-8cc4-31c66c897340","Type":"ContainerDied","Data":"1dac1d9d7ceac931c476e1fa89a68153eaabc3efa8351493368913e15b132aa7"}
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.950281 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" event={"ID":"ea0fb049-f486-4799-8cc4-31c66c897340","Type":"ContainerStarted","Data":"1e0b35e267dd0cf0de662bb8dbf36b4dc5bb65d9763cf0979b958577af88e7ea"}
Feb 23 08:42:21 crc kubenswrapper[5047]: I0223 08:42:21.978176 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-frn7b" podStartSLOduration=1.978144849 podStartE2EDuration="1.978144849s" podCreationTimestamp="2026-02-23 08:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:42:21.971776367 +0000 UTC m=+7064.223103531" watchObservedRunningTime="2026-02-23 08:42:21.978144849 +0000 UTC m=+7064.229471973"
Feb 23 08:42:22 crc kubenswrapper[5047]: I0223 08:42:22.965250 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" event={"ID":"ea0fb049-f486-4799-8cc4-31c66c897340","Type":"ContainerStarted","Data":"ab938f6b20aa6b8432c90c929ffb9c529dc3c97a1dc55488a857777ae5c5002e"}
Feb 23 08:42:23 crc kubenswrapper[5047]: I0223 08:42:23.004547 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" podStartSLOduration=3.00452023 podStartE2EDuration="3.00452023s" podCreationTimestamp="2026-02-23 08:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:42:22.991714447 +0000 UTC m=+7065.243041611" watchObservedRunningTime="2026-02-23 08:42:23.00452023 +0000 UTC m=+7065.255847374"
Feb 23 08:42:23 crc kubenswrapper[5047]: I0223 08:42:23.979497 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx"
Feb 23 08:42:26 crc kubenswrapper[5047]: I0223 08:42:26.003512 5047 generic.go:334] "Generic (PLEG): container finished" podID="120f91c7-b542-44c3-8399-bbb9f2638e9d" containerID="b1646bdaab221b27450b2e82c3c3b57b4d98452c88093359f43031a9f8604659" exitCode=0
Feb 23 08:42:26 crc kubenswrapper[5047]: I0223 08:42:26.003586 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frn7b" event={"ID":"120f91c7-b542-44c3-8399-bbb9f2638e9d","Type":"ContainerDied","Data":"b1646bdaab221b27450b2e82c3c3b57b4d98452c88093359f43031a9f8604659"}
Feb 23 08:42:26 crc kubenswrapper[5047]: I0223 08:42:26.342328 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357"
Feb 23 08:42:26 crc kubenswrapper[5047]: E0223 08:42:26.342852 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.500223 5047 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-bootstrap-frn7b" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.541506 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfrmn\" (UniqueName: \"kubernetes.io/projected/120f91c7-b542-44c3-8399-bbb9f2638e9d-kube-api-access-nfrmn\") pod \"120f91c7-b542-44c3-8399-bbb9f2638e9d\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.541696 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-fernet-keys\") pod \"120f91c7-b542-44c3-8399-bbb9f2638e9d\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.541754 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-config-data\") pod \"120f91c7-b542-44c3-8399-bbb9f2638e9d\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.541813 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-combined-ca-bundle\") pod \"120f91c7-b542-44c3-8399-bbb9f2638e9d\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.541845 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-scripts\") pod \"120f91c7-b542-44c3-8399-bbb9f2638e9d\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.541947 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-credential-keys\") pod \"120f91c7-b542-44c3-8399-bbb9f2638e9d\" (UID: \"120f91c7-b542-44c3-8399-bbb9f2638e9d\") " Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.550855 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "120f91c7-b542-44c3-8399-bbb9f2638e9d" (UID: "120f91c7-b542-44c3-8399-bbb9f2638e9d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.560172 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-scripts" (OuterVolumeSpecName: "scripts") pod "120f91c7-b542-44c3-8399-bbb9f2638e9d" (UID: "120f91c7-b542-44c3-8399-bbb9f2638e9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.564172 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120f91c7-b542-44c3-8399-bbb9f2638e9d-kube-api-access-nfrmn" (OuterVolumeSpecName: "kube-api-access-nfrmn") pod "120f91c7-b542-44c3-8399-bbb9f2638e9d" (UID: "120f91c7-b542-44c3-8399-bbb9f2638e9d"). InnerVolumeSpecName "kube-api-access-nfrmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.566015 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "120f91c7-b542-44c3-8399-bbb9f2638e9d" (UID: "120f91c7-b542-44c3-8399-bbb9f2638e9d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.594359 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-config-data" (OuterVolumeSpecName: "config-data") pod "120f91c7-b542-44c3-8399-bbb9f2638e9d" (UID: "120f91c7-b542-44c3-8399-bbb9f2638e9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.605167 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "120f91c7-b542-44c3-8399-bbb9f2638e9d" (UID: "120f91c7-b542-44c3-8399-bbb9f2638e9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.646493 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfrmn\" (UniqueName: \"kubernetes.io/projected/120f91c7-b542-44c3-8399-bbb9f2638e9d-kube-api-access-nfrmn\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.646983 5047 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.646995 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.647005 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.647014 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:27 crc kubenswrapper[5047]: I0223 08:42:27.647023 5047 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/120f91c7-b542-44c3-8399-bbb9f2638e9d-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.028384 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frn7b" event={"ID":"120f91c7-b542-44c3-8399-bbb9f2638e9d","Type":"ContainerDied","Data":"3e53f0fe35cdc660da6791cd04e2fa8ccce60280f0b740e0552eb44df1b6f347"} Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.028449 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e53f0fe35cdc660da6791cd04e2fa8ccce60280f0b740e0552eb44df1b6f347" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.028490 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-frn7b" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.208784 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-frn7b"] Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.216342 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-frn7b"] Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.319849 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2tfvh"] Feb 23 08:42:28 crc kubenswrapper[5047]: E0223 08:42:28.320403 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120f91c7-b542-44c3-8399-bbb9f2638e9d" containerName="keystone-bootstrap" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.320428 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="120f91c7-b542-44c3-8399-bbb9f2638e9d" containerName="keystone-bootstrap" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.320622 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="120f91c7-b542-44c3-8399-bbb9f2638e9d" containerName="keystone-bootstrap" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.321535 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.326334 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.326877 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w65rc" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.327029 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.327672 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.327979 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.332742 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2tfvh"] Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.352746 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120f91c7-b542-44c3-8399-bbb9f2638e9d" path="/var/lib/kubelet/pods/120f91c7-b542-44c3-8399-bbb9f2638e9d/volumes" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.359113 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-scripts\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.359563 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-fernet-keys\") pod \"keystone-bootstrap-2tfvh\" (UID: 
\"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.359779 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-combined-ca-bundle\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.360885 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-credential-keys\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.361115 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-config-data\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.361326 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw54h\" (UniqueName: \"kubernetes.io/projected/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-kube-api-access-sw54h\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.462719 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-scripts\") pod \"keystone-bootstrap-2tfvh\" (UID: 
\"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.462791 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-fernet-keys\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.462838 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-combined-ca-bundle\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.462891 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-credential-keys\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.462926 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-config-data\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.462960 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw54h\" (UniqueName: \"kubernetes.io/projected/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-kube-api-access-sw54h\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 
08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.468346 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-fernet-keys\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.468360 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-combined-ca-bundle\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.468506 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-scripts\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.471249 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-credential-keys\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.488317 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-config-data\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.496540 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw54h\" 
(UniqueName: \"kubernetes.io/projected/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-kube-api-access-sw54h\") pod \"keystone-bootstrap-2tfvh\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:28 crc kubenswrapper[5047]: I0223 08:42:28.688797 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:29 crc kubenswrapper[5047]: I0223 08:42:29.188967 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2tfvh"] Feb 23 08:42:30 crc kubenswrapper[5047]: I0223 08:42:30.053087 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tfvh" event={"ID":"8ab4fce8-e5e0-42e1-b0ec-b185ab028913","Type":"ContainerStarted","Data":"5f07377a165e12fbd1188e6620e4071673d8d697c7450db3dcd1239ec9b501a5"} Feb 23 08:42:30 crc kubenswrapper[5047]: I0223 08:42:30.053436 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tfvh" event={"ID":"8ab4fce8-e5e0-42e1-b0ec-b185ab028913","Type":"ContainerStarted","Data":"e1626e01475e4bcb67dc692712a3bef84a237bc7229f116086c951cd74fe84d9"} Feb 23 08:42:30 crc kubenswrapper[5047]: I0223 08:42:30.089724 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2tfvh" podStartSLOduration=2.089702095 podStartE2EDuration="2.089702095s" podCreationTimestamp="2026-02-23 08:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:42:30.079356118 +0000 UTC m=+7072.330683272" watchObservedRunningTime="2026-02-23 08:42:30.089702095 +0000 UTC m=+7072.341029259" Feb 23 08:42:30 crc kubenswrapper[5047]: I0223 08:42:30.898146 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.009379 
5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb4868d5-th7hh"] Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.009712 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" podUID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerName="dnsmasq-dns" containerID="cri-o://5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb" gracePeriod=10 Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.489017 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.620072 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-sb\") pod \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.620153 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-config\") pod \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.620250 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zlkz\" (UniqueName: \"kubernetes.io/projected/61bec6f6-3b99-42f1-bc30-c73dada51d9d-kube-api-access-6zlkz\") pod \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.620333 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-nb\") pod 
\"61bec6f6-3b99-42f1-bc30-c73dada51d9d\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.620356 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-dns-svc\") pod \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\" (UID: \"61bec6f6-3b99-42f1-bc30-c73dada51d9d\") " Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.635998 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bec6f6-3b99-42f1-bc30-c73dada51d9d-kube-api-access-6zlkz" (OuterVolumeSpecName: "kube-api-access-6zlkz") pod "61bec6f6-3b99-42f1-bc30-c73dada51d9d" (UID: "61bec6f6-3b99-42f1-bc30-c73dada51d9d"). InnerVolumeSpecName "kube-api-access-6zlkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.662500 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61bec6f6-3b99-42f1-bc30-c73dada51d9d" (UID: "61bec6f6-3b99-42f1-bc30-c73dada51d9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.686065 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61bec6f6-3b99-42f1-bc30-c73dada51d9d" (UID: "61bec6f6-3b99-42f1-bc30-c73dada51d9d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.686465 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-config" (OuterVolumeSpecName: "config") pod "61bec6f6-3b99-42f1-bc30-c73dada51d9d" (UID: "61bec6f6-3b99-42f1-bc30-c73dada51d9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.694733 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61bec6f6-3b99-42f1-bc30-c73dada51d9d" (UID: "61bec6f6-3b99-42f1-bc30-c73dada51d9d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.722143 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.722181 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.722193 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:31 crc kubenswrapper[5047]: I0223 08:42:31.722207 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61bec6f6-3b99-42f1-bc30-c73dada51d9d-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:31 crc kubenswrapper[5047]: 
I0223 08:42:31.722220 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zlkz\" (UniqueName: \"kubernetes.io/projected/61bec6f6-3b99-42f1-bc30-c73dada51d9d-kube-api-access-6zlkz\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.076775 5047 generic.go:334] "Generic (PLEG): container finished" podID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerID="5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb" exitCode=0 Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.076836 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.076862 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" event={"ID":"61bec6f6-3b99-42f1-bc30-c73dada51d9d","Type":"ContainerDied","Data":"5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb"} Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.077152 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb4868d5-th7hh" event={"ID":"61bec6f6-3b99-42f1-bc30-c73dada51d9d","Type":"ContainerDied","Data":"6d0c8470e1e28a8205943cb0a4f69e813f46c6b6104e1ef4ee07145a4dfcb069"} Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.077212 5047 scope.go:117] "RemoveContainer" containerID="5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.113748 5047 scope.go:117] "RemoveContainer" containerID="a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.152412 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb4868d5-th7hh"] Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.163866 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdb4868d5-th7hh"] Feb 23 
08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.280725 5047 scope.go:117] "RemoveContainer" containerID="5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb" Feb 23 08:42:32 crc kubenswrapper[5047]: E0223 08:42:32.281359 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb\": container with ID starting with 5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb not found: ID does not exist" containerID="5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.281433 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb"} err="failed to get container status \"5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb\": rpc error: code = NotFound desc = could not find container \"5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb\": container with ID starting with 5206799de092a8bb39701d066a005aac2011999f7a223af18f57b3429283ceeb not found: ID does not exist" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.281493 5047 scope.go:117] "RemoveContainer" containerID="a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2" Feb 23 08:42:32 crc kubenswrapper[5047]: E0223 08:42:32.282281 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2\": container with ID starting with a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2 not found: ID does not exist" containerID="a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.282345 5047 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2"} err="failed to get container status \"a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2\": rpc error: code = NotFound desc = could not find container \"a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2\": container with ID starting with a4e3762cdf941e48ed1fc8b7bf038777c5a39177378fecc991b895afb54624b2 not found: ID does not exist" Feb 23 08:42:32 crc kubenswrapper[5047]: I0223 08:42:32.361956 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" path="/var/lib/kubelet/pods/61bec6f6-3b99-42f1-bc30-c73dada51d9d/volumes" Feb 23 08:42:33 crc kubenswrapper[5047]: I0223 08:42:33.109070 5047 generic.go:334] "Generic (PLEG): container finished" podID="8ab4fce8-e5e0-42e1-b0ec-b185ab028913" containerID="5f07377a165e12fbd1188e6620e4071673d8d697c7450db3dcd1239ec9b501a5" exitCode=0 Feb 23 08:42:33 crc kubenswrapper[5047]: I0223 08:42:33.109164 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tfvh" event={"ID":"8ab4fce8-e5e0-42e1-b0ec-b185ab028913","Type":"ContainerDied","Data":"5f07377a165e12fbd1188e6620e4071673d8d697c7450db3dcd1239ec9b501a5"} Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.536476 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.705193 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-config-data\") pod \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.705317 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-fernet-keys\") pod \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.705350 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-credential-keys\") pod \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.705438 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-combined-ca-bundle\") pod \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.705544 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw54h\" (UniqueName: \"kubernetes.io/projected/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-kube-api-access-sw54h\") pod \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.705586 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-scripts\") pod \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\" (UID: \"8ab4fce8-e5e0-42e1-b0ec-b185ab028913\") " Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.711872 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8ab4fce8-e5e0-42e1-b0ec-b185ab028913" (UID: "8ab4fce8-e5e0-42e1-b0ec-b185ab028913"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.713210 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8ab4fce8-e5e0-42e1-b0ec-b185ab028913" (UID: "8ab4fce8-e5e0-42e1-b0ec-b185ab028913"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.716007 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-kube-api-access-sw54h" (OuterVolumeSpecName: "kube-api-access-sw54h") pod "8ab4fce8-e5e0-42e1-b0ec-b185ab028913" (UID: "8ab4fce8-e5e0-42e1-b0ec-b185ab028913"). InnerVolumeSpecName "kube-api-access-sw54h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.716062 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-scripts" (OuterVolumeSpecName: "scripts") pod "8ab4fce8-e5e0-42e1-b0ec-b185ab028913" (UID: "8ab4fce8-e5e0-42e1-b0ec-b185ab028913"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.734826 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ab4fce8-e5e0-42e1-b0ec-b185ab028913" (UID: "8ab4fce8-e5e0-42e1-b0ec-b185ab028913"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.748797 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-config-data" (OuterVolumeSpecName: "config-data") pod "8ab4fce8-e5e0-42e1-b0ec-b185ab028913" (UID: "8ab4fce8-e5e0-42e1-b0ec-b185ab028913"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.818731 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw54h\" (UniqueName: \"kubernetes.io/projected/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-kube-api-access-sw54h\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.818778 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.818792 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.818807 5047 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:34 crc 
kubenswrapper[5047]: I0223 08:42:34.818826 5047 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:34 crc kubenswrapper[5047]: I0223 08:42:34.818841 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab4fce8-e5e0-42e1-b0ec-b185ab028913-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.136559 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tfvh" event={"ID":"8ab4fce8-e5e0-42e1-b0ec-b185ab028913","Type":"ContainerDied","Data":"e1626e01475e4bcb67dc692712a3bef84a237bc7229f116086c951cd74fe84d9"} Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.136654 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1626e01475e4bcb67dc692712a3bef84a237bc7229f116086c951cd74fe84d9" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.136673 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2tfvh" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.378404 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-75f6dd64bf-sswjk"] Feb 23 08:42:35 crc kubenswrapper[5047]: E0223 08:42:35.378950 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerName="init" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.378977 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerName="init" Feb 23 08:42:35 crc kubenswrapper[5047]: E0223 08:42:35.379024 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerName="dnsmasq-dns" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.379037 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerName="dnsmasq-dns" Feb 23 08:42:35 crc kubenswrapper[5047]: E0223 08:42:35.379050 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab4fce8-e5e0-42e1-b0ec-b185ab028913" containerName="keystone-bootstrap" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.379063 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab4fce8-e5e0-42e1-b0ec-b185ab028913" containerName="keystone-bootstrap" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.379327 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab4fce8-e5e0-42e1-b0ec-b185ab028913" containerName="keystone-bootstrap" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.379348 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bec6f6-3b99-42f1-bc30-c73dada51d9d" containerName="dnsmasq-dns" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.380262 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.387535 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.387580 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.388122 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.388138 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.388640 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.390324 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w65rc" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.408532 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75f6dd64bf-sswjk"] Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429076 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-combined-ca-bundle\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429129 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-public-tls-certs\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " 
pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429175 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7x2\" (UniqueName: \"kubernetes.io/projected/55697218-de1b-424f-b5ff-2d0806e54a96-kube-api-access-gd7x2\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429242 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-internal-tls-certs\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429269 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-scripts\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429302 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-fernet-keys\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429339 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-credential-keys\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") 
" pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.429368 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-config-data\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531103 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-scripts\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531162 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-fernet-keys\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531192 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-credential-keys\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531217 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-config-data\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531257 
5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-combined-ca-bundle\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531278 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-public-tls-certs\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531312 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7x2\" (UniqueName: \"kubernetes.io/projected/55697218-de1b-424f-b5ff-2d0806e54a96-kube-api-access-gd7x2\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.531364 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-internal-tls-certs\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.537387 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-public-tls-certs\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.538484 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-scripts\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.538532 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-credential-keys\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.539205 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-internal-tls-certs\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.539509 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-combined-ca-bundle\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.546537 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-fernet-keys\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.551861 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-config-data\") pod 
\"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.552006 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7x2\" (UniqueName: \"kubernetes.io/projected/55697218-de1b-424f-b5ff-2d0806e54a96-kube-api-access-gd7x2\") pod \"keystone-75f6dd64bf-sswjk\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:35 crc kubenswrapper[5047]: I0223 08:42:35.707699 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:36 crc kubenswrapper[5047]: I0223 08:42:36.276493 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75f6dd64bf-sswjk"] Feb 23 08:42:37 crc kubenswrapper[5047]: I0223 08:42:37.160621 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75f6dd64bf-sswjk" event={"ID":"55697218-de1b-424f-b5ff-2d0806e54a96","Type":"ContainerStarted","Data":"a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285"} Feb 23 08:42:37 crc kubenswrapper[5047]: I0223 08:42:37.160722 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75f6dd64bf-sswjk" event={"ID":"55697218-de1b-424f-b5ff-2d0806e54a96","Type":"ContainerStarted","Data":"6e6604c64fd88ae2771dbb3358825a5cb8bb77bda481a6b3261006aa838d5630"} Feb 23 08:42:37 crc kubenswrapper[5047]: I0223 08:42:37.160949 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:42:37 crc kubenswrapper[5047]: I0223 08:42:37.196617 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-75f6dd64bf-sswjk" podStartSLOduration=2.196570931 podStartE2EDuration="2.196570931s" podCreationTimestamp="2026-02-23 08:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:42:37.182990726 +0000 UTC m=+7079.434317930" watchObservedRunningTime="2026-02-23 08:42:37.196570931 +0000 UTC m=+7079.447898115" Feb 23 08:42:40 crc kubenswrapper[5047]: I0223 08:42:40.341245 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:42:40 crc kubenswrapper[5047]: E0223 08:42:40.341848 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:42:53 crc kubenswrapper[5047]: I0223 08:42:53.340603 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:42:54 crc kubenswrapper[5047]: I0223 08:42:54.365091 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"5480ffc93828193b25025f4f8362ac8f87100f59cbe44b98393e3159b56048e0"} Feb 23 08:43:07 crc kubenswrapper[5047]: I0223 08:43:07.377579 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.560119 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.562123 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.567367 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.567388 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.567967 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7wvdc" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.594129 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.614188 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.614308 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bx4s\" (UniqueName: \"kubernetes.io/projected/7c6ac225-0341-4574-b14d-8c904e846730-kube-api-access-6bx4s\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.614372 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.614519 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.716522 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.716603 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bx4s\" (UniqueName: \"kubernetes.io/projected/7c6ac225-0341-4574-b14d-8c904e846730-kube-api-access-6bx4s\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.716642 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.716726 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.717762 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.725621 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.726965 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.738489 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bx4s\" (UniqueName: \"kubernetes.io/projected/7c6ac225-0341-4574-b14d-8c904e846730-kube-api-access-6bx4s\") pod \"openstackclient\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " pod="openstack/openstackclient" Feb 23 08:43:08 crc kubenswrapper[5047]: I0223 08:43:08.892783 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:43:09 crc kubenswrapper[5047]: I0223 08:43:09.389612 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:43:09 crc kubenswrapper[5047]: I0223 08:43:09.392155 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:43:09 crc kubenswrapper[5047]: I0223 08:43:09.537867 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7c6ac225-0341-4574-b14d-8c904e846730","Type":"ContainerStarted","Data":"2345e36fc4ce1562be52384a07e6b535e8827a9d5331de30db540a505ff761eb"} Feb 23 08:43:21 crc kubenswrapper[5047]: I0223 08:43:21.720132 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7c6ac225-0341-4574-b14d-8c904e846730","Type":"ContainerStarted","Data":"ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad"} Feb 23 08:43:21 crc kubenswrapper[5047]: I0223 08:43:21.756450 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.9304072620000001 podStartE2EDuration="13.756424605s" podCreationTimestamp="2026-02-23 08:43:08 +0000 UTC" firstStartedPulling="2026-02-23 08:43:09.391820383 +0000 UTC m=+7111.643147517" lastFinishedPulling="2026-02-23 08:43:21.217837686 +0000 UTC m=+7123.469164860" observedRunningTime="2026-02-23 08:43:21.742095619 +0000 UTC m=+7123.993422763" watchObservedRunningTime="2026-02-23 08:43:21.756424605 +0000 UTC m=+7124.007751749" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.409005 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2z2bj"] Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.412876 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.420089 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z2bj"] Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.510620 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-catalog-content\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.510700 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs6nk\" (UniqueName: \"kubernetes.io/projected/bee247f6-dbc5-4e5c-b25c-bfbab835035d-kube-api-access-bs6nk\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.510741 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-utilities\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.612514 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-catalog-content\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.612605 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bs6nk\" (UniqueName: \"kubernetes.io/projected/bee247f6-dbc5-4e5c-b25c-bfbab835035d-kube-api-access-bs6nk\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.612671 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-utilities\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.613241 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-catalog-content\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.613284 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-utilities\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.635354 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs6nk\" (UniqueName: \"kubernetes.io/projected/bee247f6-dbc5-4e5c-b25c-bfbab835035d-kube-api-access-bs6nk\") pod \"certified-operators-2z2bj\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:35 crc kubenswrapper[5047]: I0223 08:44:35.775482 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:36 crc kubenswrapper[5047]: I0223 08:44:36.313834 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z2bj"] Feb 23 08:44:36 crc kubenswrapper[5047]: I0223 08:44:36.544977 5047 generic.go:334] "Generic (PLEG): container finished" podID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerID="df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a" exitCode=0 Feb 23 08:44:36 crc kubenswrapper[5047]: I0223 08:44:36.545140 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z2bj" event={"ID":"bee247f6-dbc5-4e5c-b25c-bfbab835035d","Type":"ContainerDied","Data":"df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a"} Feb 23 08:44:36 crc kubenswrapper[5047]: I0223 08:44:36.545426 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z2bj" event={"ID":"bee247f6-dbc5-4e5c-b25c-bfbab835035d","Type":"ContainerStarted","Data":"2f193230a71a26a56d62af829d0c6ca4414046ea8a674059df70b007c72fcde6"} Feb 23 08:44:37 crc kubenswrapper[5047]: I0223 08:44:37.559977 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z2bj" event={"ID":"bee247f6-dbc5-4e5c-b25c-bfbab835035d","Type":"ContainerStarted","Data":"65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da"} Feb 23 08:44:38 crc kubenswrapper[5047]: I0223 08:44:38.573478 5047 generic.go:334] "Generic (PLEG): container finished" podID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerID="65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da" exitCode=0 Feb 23 08:44:38 crc kubenswrapper[5047]: I0223 08:44:38.573561 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z2bj" 
event={"ID":"bee247f6-dbc5-4e5c-b25c-bfbab835035d","Type":"ContainerDied","Data":"65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da"} Feb 23 08:44:39 crc kubenswrapper[5047]: I0223 08:44:39.586184 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z2bj" event={"ID":"bee247f6-dbc5-4e5c-b25c-bfbab835035d","Type":"ContainerStarted","Data":"f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde"} Feb 23 08:44:39 crc kubenswrapper[5047]: I0223 08:44:39.638844 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2z2bj" podStartSLOduration=2.228725926 podStartE2EDuration="4.638828802s" podCreationTimestamp="2026-02-23 08:44:35 +0000 UTC" firstStartedPulling="2026-02-23 08:44:36.548254899 +0000 UTC m=+7198.799582043" lastFinishedPulling="2026-02-23 08:44:38.958357755 +0000 UTC m=+7201.209684919" observedRunningTime="2026-02-23 08:44:39.633217231 +0000 UTC m=+7201.884544355" watchObservedRunningTime="2026-02-23 08:44:39.638828802 +0000 UTC m=+7201.890155936" Feb 23 08:44:45 crc kubenswrapper[5047]: I0223 08:44:45.775985 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:45 crc kubenswrapper[5047]: I0223 08:44:45.776655 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:45 crc kubenswrapper[5047]: I0223 08:44:45.852974 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:46 crc kubenswrapper[5047]: I0223 08:44:46.708594 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:46 crc kubenswrapper[5047]: I0223 08:44:46.778193 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-2z2bj"] Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.618407 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-k2z59"] Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.619873 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.637012 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b60b-account-create-update-qclq5"] Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.638527 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.644529 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.648948 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-k2z59"] Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.666266 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b60b-account-create-update-qclq5"] Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.672071 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2z2bj" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="registry-server" containerID="cri-o://f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde" gracePeriod=2 Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.727040 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rps2b\" (UniqueName: \"kubernetes.io/projected/18e78237-727f-4a10-bc58-53cbb650f5e2-kube-api-access-rps2b\") pod \"barbican-db-create-k2z59\" (UID: 
\"18e78237-727f-4a10-bc58-53cbb650f5e2\") " pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.727384 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e78237-727f-4a10-bc58-53cbb650f5e2-operator-scripts\") pod \"barbican-db-create-k2z59\" (UID: \"18e78237-727f-4a10-bc58-53cbb650f5e2\") " pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.829295 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rps2b\" (UniqueName: \"kubernetes.io/projected/18e78237-727f-4a10-bc58-53cbb650f5e2-kube-api-access-rps2b\") pod \"barbican-db-create-k2z59\" (UID: \"18e78237-727f-4a10-bc58-53cbb650f5e2\") " pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.829424 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3eac5f-445c-426a-8b40-da353ac3e4d7-operator-scripts\") pod \"barbican-b60b-account-create-update-qclq5\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.829549 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2crmh\" (UniqueName: \"kubernetes.io/projected/2d3eac5f-445c-426a-8b40-da353ac3e4d7-kube-api-access-2crmh\") pod \"barbican-b60b-account-create-update-qclq5\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.829600 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/18e78237-727f-4a10-bc58-53cbb650f5e2-operator-scripts\") pod \"barbican-db-create-k2z59\" (UID: \"18e78237-727f-4a10-bc58-53cbb650f5e2\") " pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.830897 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e78237-727f-4a10-bc58-53cbb650f5e2-operator-scripts\") pod \"barbican-db-create-k2z59\" (UID: \"18e78237-727f-4a10-bc58-53cbb650f5e2\") " pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.854005 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rps2b\" (UniqueName: \"kubernetes.io/projected/18e78237-727f-4a10-bc58-53cbb650f5e2-kube-api-access-rps2b\") pod \"barbican-db-create-k2z59\" (UID: \"18e78237-727f-4a10-bc58-53cbb650f5e2\") " pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.931082 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3eac5f-445c-426a-8b40-da353ac3e4d7-operator-scripts\") pod \"barbican-b60b-account-create-update-qclq5\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.932173 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3eac5f-445c-426a-8b40-da353ac3e4d7-operator-scripts\") pod \"barbican-b60b-account-create-update-qclq5\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.932774 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2crmh\" (UniqueName: 
\"kubernetes.io/projected/2d3eac5f-445c-426a-8b40-da353ac3e4d7-kube-api-access-2crmh\") pod \"barbican-b60b-account-create-update-qclq5\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.954645 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.958488 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2crmh\" (UniqueName: \"kubernetes.io/projected/2d3eac5f-445c-426a-8b40-da353ac3e4d7-kube-api-access-2crmh\") pod \"barbican-b60b-account-create-update-qclq5\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:48 crc kubenswrapper[5047]: I0223 08:44:48.965659 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.147370 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.341720 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-utilities\") pod \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.342114 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-catalog-content\") pod \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.342252 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs6nk\" (UniqueName: \"kubernetes.io/projected/bee247f6-dbc5-4e5c-b25c-bfbab835035d-kube-api-access-bs6nk\") pod \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\" (UID: \"bee247f6-dbc5-4e5c-b25c-bfbab835035d\") " Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.342960 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-utilities" (OuterVolumeSpecName: "utilities") pod "bee247f6-dbc5-4e5c-b25c-bfbab835035d" (UID: "bee247f6-dbc5-4e5c-b25c-bfbab835035d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.348013 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee247f6-dbc5-4e5c-b25c-bfbab835035d-kube-api-access-bs6nk" (OuterVolumeSpecName: "kube-api-access-bs6nk") pod "bee247f6-dbc5-4e5c-b25c-bfbab835035d" (UID: "bee247f6-dbc5-4e5c-b25c-bfbab835035d"). InnerVolumeSpecName "kube-api-access-bs6nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.400506 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bee247f6-dbc5-4e5c-b25c-bfbab835035d" (UID: "bee247f6-dbc5-4e5c-b25c-bfbab835035d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.444150 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs6nk\" (UniqueName: \"kubernetes.io/projected/bee247f6-dbc5-4e5c-b25c-bfbab835035d-kube-api-access-bs6nk\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.444188 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.444199 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bee247f6-dbc5-4e5c-b25c-bfbab835035d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.521856 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b60b-account-create-update-qclq5"] Feb 23 08:44:49 crc kubenswrapper[5047]: W0223 08:44:49.528211 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e78237_727f_4a10_bc58_53cbb650f5e2.slice/crio-856bb48a3816a37e9a853f606bf42674776bf893e16542ae424726c844ee13d9 WatchSource:0}: Error finding container 856bb48a3816a37e9a853f606bf42674776bf893e16542ae424726c844ee13d9: Status 404 returned error can't find the container with id 
856bb48a3816a37e9a853f606bf42674776bf893e16542ae424726c844ee13d9 Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.532691 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-k2z59"] Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.681672 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k2z59" event={"ID":"18e78237-727f-4a10-bc58-53cbb650f5e2","Type":"ContainerStarted","Data":"924b774bf422115ee2d85401ee8918ca65c2669a4a178bdb31127a433d44f634"} Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.681737 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k2z59" event={"ID":"18e78237-727f-4a10-bc58-53cbb650f5e2","Type":"ContainerStarted","Data":"856bb48a3816a37e9a853f606bf42674776bf893e16542ae424726c844ee13d9"} Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.685873 5047 generic.go:334] "Generic (PLEG): container finished" podID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerID="f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde" exitCode=0 Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.686012 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z2bj" event={"ID":"bee247f6-dbc5-4e5c-b25c-bfbab835035d","Type":"ContainerDied","Data":"f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde"} Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.686030 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z2bj" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.686059 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z2bj" event={"ID":"bee247f6-dbc5-4e5c-b25c-bfbab835035d","Type":"ContainerDied","Data":"2f193230a71a26a56d62af829d0c6ca4414046ea8a674059df70b007c72fcde6"} Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.686088 5047 scope.go:117] "RemoveContainer" containerID="f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.688161 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b60b-account-create-update-qclq5" event={"ID":"2d3eac5f-445c-426a-8b40-da353ac3e4d7","Type":"ContainerStarted","Data":"b51e998165c084679a97f5c7335ff4b3417b8dfb0bf00b6799909acce411d059"} Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.688216 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b60b-account-create-update-qclq5" event={"ID":"2d3eac5f-445c-426a-8b40-da353ac3e4d7","Type":"ContainerStarted","Data":"0cc2ce045438811316a11d2d10c2c846e9c8206fd7f27840c6ac6346afd2def8"} Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.707582 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-k2z59" podStartSLOduration=1.707544123 podStartE2EDuration="1.707544123s" podCreationTimestamp="2026-02-23 08:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:44:49.695064759 +0000 UTC m=+7211.946391943" watchObservedRunningTime="2026-02-23 08:44:49.707544123 +0000 UTC m=+7211.958871297" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.723286 5047 scope.go:117] "RemoveContainer" containerID="65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da" Feb 23 08:44:49 crc 
kubenswrapper[5047]: I0223 08:44:49.734936 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b60b-account-create-update-qclq5" podStartSLOduration=1.734880177 podStartE2EDuration="1.734880177s" podCreationTimestamp="2026-02-23 08:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:44:49.715419845 +0000 UTC m=+7211.966746989" watchObservedRunningTime="2026-02-23 08:44:49.734880177 +0000 UTC m=+7211.986207341" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.752598 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2z2bj"] Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.755546 5047 scope.go:117] "RemoveContainer" containerID="df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.764025 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2z2bj"] Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.778668 5047 scope.go:117] "RemoveContainer" containerID="f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde" Feb 23 08:44:49 crc kubenswrapper[5047]: E0223 08:44:49.779209 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde\": container with ID starting with f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde not found: ID does not exist" containerID="f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.779243 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde"} err="failed to get container status 
\"f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde\": rpc error: code = NotFound desc = could not find container \"f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde\": container with ID starting with f5855d662d26d2b2338ff96e0b73c3bede628c670627dc9ab27cebbf5732dfde not found: ID does not exist" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.779268 5047 scope.go:117] "RemoveContainer" containerID="65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da" Feb 23 08:44:49 crc kubenswrapper[5047]: E0223 08:44:49.779553 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da\": container with ID starting with 65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da not found: ID does not exist" containerID="65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.779602 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da"} err="failed to get container status \"65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da\": rpc error: code = NotFound desc = could not find container \"65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da\": container with ID starting with 65eea516b40172b5c82c770a28dda13166e627545ad919eb521414268da7a0da not found: ID does not exist" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.779633 5047 scope.go:117] "RemoveContainer" containerID="df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a" Feb 23 08:44:49 crc kubenswrapper[5047]: E0223 08:44:49.780200 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a\": container with ID starting with df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a not found: ID does not exist" containerID="df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a" Feb 23 08:44:49 crc kubenswrapper[5047]: I0223 08:44:49.780229 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a"} err="failed to get container status \"df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a\": rpc error: code = NotFound desc = could not find container \"df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a\": container with ID starting with df0af1a508e149626e6ca134fafb2a65056b0ce3810af72bcccc5d83a3ddaa3a not found: ID does not exist" Feb 23 08:44:50 crc kubenswrapper[5047]: I0223 08:44:50.388030 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" path="/var/lib/kubelet/pods/bee247f6-dbc5-4e5c-b25c-bfbab835035d/volumes" Feb 23 08:44:50 crc kubenswrapper[5047]: I0223 08:44:50.704889 5047 generic.go:334] "Generic (PLEG): container finished" podID="2d3eac5f-445c-426a-8b40-da353ac3e4d7" containerID="b51e998165c084679a97f5c7335ff4b3417b8dfb0bf00b6799909acce411d059" exitCode=0 Feb 23 08:44:50 crc kubenswrapper[5047]: I0223 08:44:50.704993 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b60b-account-create-update-qclq5" event={"ID":"2d3eac5f-445c-426a-8b40-da353ac3e4d7","Type":"ContainerDied","Data":"b51e998165c084679a97f5c7335ff4b3417b8dfb0bf00b6799909acce411d059"} Feb 23 08:44:50 crc kubenswrapper[5047]: I0223 08:44:50.710380 5047 generic.go:334] "Generic (PLEG): container finished" podID="18e78237-727f-4a10-bc58-53cbb650f5e2" containerID="924b774bf422115ee2d85401ee8918ca65c2669a4a178bdb31127a433d44f634" exitCode=0 Feb 23 08:44:50 crc kubenswrapper[5047]: 
I0223 08:44:50.710527 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k2z59" event={"ID":"18e78237-727f-4a10-bc58-53cbb650f5e2","Type":"ContainerDied","Data":"924b774bf422115ee2d85401ee8918ca65c2669a4a178bdb31127a433d44f634"} Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.219499 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.231292 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.403681 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2crmh\" (UniqueName: \"kubernetes.io/projected/2d3eac5f-445c-426a-8b40-da353ac3e4d7-kube-api-access-2crmh\") pod \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.403724 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e78237-727f-4a10-bc58-53cbb650f5e2-operator-scripts\") pod \"18e78237-727f-4a10-bc58-53cbb650f5e2\" (UID: \"18e78237-727f-4a10-bc58-53cbb650f5e2\") " Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.403754 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3eac5f-445c-426a-8b40-da353ac3e4d7-operator-scripts\") pod \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\" (UID: \"2d3eac5f-445c-426a-8b40-da353ac3e4d7\") " Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.403851 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rps2b\" (UniqueName: 
\"kubernetes.io/projected/18e78237-727f-4a10-bc58-53cbb650f5e2-kube-api-access-rps2b\") pod \"18e78237-727f-4a10-bc58-53cbb650f5e2\" (UID: \"18e78237-727f-4a10-bc58-53cbb650f5e2\") " Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.404199 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e78237-727f-4a10-bc58-53cbb650f5e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18e78237-727f-4a10-bc58-53cbb650f5e2" (UID: "18e78237-727f-4a10-bc58-53cbb650f5e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.404684 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d3eac5f-445c-426a-8b40-da353ac3e4d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d3eac5f-445c-426a-8b40-da353ac3e4d7" (UID: "2d3eac5f-445c-426a-8b40-da353ac3e4d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.412713 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3eac5f-445c-426a-8b40-da353ac3e4d7-kube-api-access-2crmh" (OuterVolumeSpecName: "kube-api-access-2crmh") pod "2d3eac5f-445c-426a-8b40-da353ac3e4d7" (UID: "2d3eac5f-445c-426a-8b40-da353ac3e4d7"). InnerVolumeSpecName "kube-api-access-2crmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.413050 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e78237-727f-4a10-bc58-53cbb650f5e2-kube-api-access-rps2b" (OuterVolumeSpecName: "kube-api-access-rps2b") pod "18e78237-727f-4a10-bc58-53cbb650f5e2" (UID: "18e78237-727f-4a10-bc58-53cbb650f5e2"). InnerVolumeSpecName "kube-api-access-rps2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.507391 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2crmh\" (UniqueName: \"kubernetes.io/projected/2d3eac5f-445c-426a-8b40-da353ac3e4d7-kube-api-access-2crmh\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.507438 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e78237-727f-4a10-bc58-53cbb650f5e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.507457 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3eac5f-445c-426a-8b40-da353ac3e4d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.507479 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rps2b\" (UniqueName: \"kubernetes.io/projected/18e78237-727f-4a10-bc58-53cbb650f5e2-kube-api-access-rps2b\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.756211 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b60b-account-create-update-qclq5" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.757444 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b60b-account-create-update-qclq5" event={"ID":"2d3eac5f-445c-426a-8b40-da353ac3e4d7","Type":"ContainerDied","Data":"0cc2ce045438811316a11d2d10c2c846e9c8206fd7f27840c6ac6346afd2def8"} Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.757499 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cc2ce045438811316a11d2d10c2c846e9c8206fd7f27840c6ac6346afd2def8" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.760629 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k2z59" event={"ID":"18e78237-727f-4a10-bc58-53cbb650f5e2","Type":"ContainerDied","Data":"856bb48a3816a37e9a853f606bf42674776bf893e16542ae424726c844ee13d9"} Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.760782 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856bb48a3816a37e9a853f606bf42674776bf893e16542ae424726c844ee13d9" Feb 23 08:44:52 crc kubenswrapper[5047]: I0223 08:44:52.760948 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-k2z59" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.037661 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-kdn4n"] Feb 23 08:44:54 crc kubenswrapper[5047]: E0223 08:44:54.038763 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="extract-content" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.038783 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="extract-content" Feb 23 08:44:54 crc kubenswrapper[5047]: E0223 08:44:54.038799 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3eac5f-445c-426a-8b40-da353ac3e4d7" containerName="mariadb-account-create-update" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.038808 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3eac5f-445c-426a-8b40-da353ac3e4d7" containerName="mariadb-account-create-update" Feb 23 08:44:54 crc kubenswrapper[5047]: E0223 08:44:54.038838 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="extract-utilities" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.038847 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="extract-utilities" Feb 23 08:44:54 crc kubenswrapper[5047]: E0223 08:44:54.038866 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="registry-server" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.038874 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="registry-server" Feb 23 08:44:54 crc kubenswrapper[5047]: E0223 08:44:54.038890 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18e78237-727f-4a10-bc58-53cbb650f5e2" containerName="mariadb-database-create" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.038899 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e78237-727f-4a10-bc58-53cbb650f5e2" containerName="mariadb-database-create" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.039112 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee247f6-dbc5-4e5c-b25c-bfbab835035d" containerName="registry-server" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.039139 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e78237-727f-4a10-bc58-53cbb650f5e2" containerName="mariadb-database-create" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.039184 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3eac5f-445c-426a-8b40-da353ac3e4d7" containerName="mariadb-account-create-update" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.039969 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.043284 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mr4v8" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.043468 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.075375 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kdn4n"] Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.143768 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-combined-ca-bundle\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.144133 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtdm\" (UniqueName: \"kubernetes.io/projected/0c60b58d-c0c1-470c-b84d-79528f882fb6-kube-api-access-7jtdm\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.144311 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-db-sync-config-data\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.247879 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-db-sync-config-data\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.248162 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-combined-ca-bundle\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.248220 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtdm\" (UniqueName: \"kubernetes.io/projected/0c60b58d-c0c1-470c-b84d-79528f882fb6-kube-api-access-7jtdm\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.256055 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-db-sync-config-data\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.267897 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtdm\" (UniqueName: \"kubernetes.io/projected/0c60b58d-c0c1-470c-b84d-79528f882fb6-kube-api-access-7jtdm\") pod \"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.269584 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-combined-ca-bundle\") pod 
\"barbican-db-sync-kdn4n\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.378370 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.709459 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kdn4n"] Feb 23 08:44:54 crc kubenswrapper[5047]: I0223 08:44:54.781121 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kdn4n" event={"ID":"0c60b58d-c0c1-470c-b84d-79528f882fb6","Type":"ContainerStarted","Data":"84880a24510c66a1949de3f60857a0f014773cdb14aed585ae56ef2d93be2a59"} Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.142074 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr"] Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.144335 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.147222 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.147242 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.172152 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr"] Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.272979 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-config-volume\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.273504 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98m67\" (UniqueName: \"kubernetes.io/projected/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-kube-api-access-98m67\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.273536 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-secret-volume\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.376077 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-config-volume\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.376165 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98m67\" (UniqueName: \"kubernetes.io/projected/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-kube-api-access-98m67\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.376197 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-secret-volume\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.377145 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-config-volume\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.380744 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-secret-volume\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.393802 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98m67\" (UniqueName: \"kubernetes.io/projected/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-kube-api-access-98m67\") pod \"collect-profiles-29530605-4r9dr\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.477059 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.845176 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kdn4n" event={"ID":"0c60b58d-c0c1-470c-b84d-79528f882fb6","Type":"ContainerStarted","Data":"25a65cae038fba539bc27589cc7c4f44444fed5df321cb4a5f40a5de73068b1e"} Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.861924 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-kdn4n" podStartSLOduration=1.384631858 podStartE2EDuration="6.861891928s" podCreationTimestamp="2026-02-23 08:44:54 +0000 UTC" firstStartedPulling="2026-02-23 08:44:54.724899198 +0000 UTC m=+7216.976226332" lastFinishedPulling="2026-02-23 08:45:00.202159268 +0000 UTC m=+7222.453486402" observedRunningTime="2026-02-23 08:45:00.859697789 +0000 UTC m=+7223.111024963" watchObservedRunningTime="2026-02-23 08:45:00.861891928 +0000 UTC m=+7223.113219062" Feb 23 08:45:00 crc kubenswrapper[5047]: I0223 08:45:00.981551 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr"] Feb 23 08:45:00 crc kubenswrapper[5047]: W0223 08:45:00.994039 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a94a7d2_4cfd_4b89_8b1a_41608f29bf97.slice/crio-8feb430d70992db548fc1fc4be94528d33d45198e47da098eb95bffebafeab2c WatchSource:0}: Error finding container 8feb430d70992db548fc1fc4be94528d33d45198e47da098eb95bffebafeab2c: Status 404 returned error can't find the container with id 8feb430d70992db548fc1fc4be94528d33d45198e47da098eb95bffebafeab2c Feb 23 08:45:01 crc kubenswrapper[5047]: I0223 08:45:01.859326 5047 generic.go:334] "Generic (PLEG): container finished" podID="2a94a7d2-4cfd-4b89-8b1a-41608f29bf97" containerID="25f5b79a5fa6716b7323fca9ba001b48f1c12d9bbc024d92d8daa95db7015ced" exitCode=0 Feb 23 08:45:01 crc kubenswrapper[5047]: I0223 08:45:01.859439 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" event={"ID":"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97","Type":"ContainerDied","Data":"25f5b79a5fa6716b7323fca9ba001b48f1c12d9bbc024d92d8daa95db7015ced"} Feb 23 08:45:01 crc kubenswrapper[5047]: I0223 08:45:01.860136 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" event={"ID":"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97","Type":"ContainerStarted","Data":"8feb430d70992db548fc1fc4be94528d33d45198e47da098eb95bffebafeab2c"} Feb 23 08:45:02 crc kubenswrapper[5047]: E0223 08:45:02.491536 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c60b58d_c0c1_470c_b84d_79528f882fb6.slice/crio-25a65cae038fba539bc27589cc7c4f44444fed5df321cb4a5f40a5de73068b1e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c60b58d_c0c1_470c_b84d_79528f882fb6.slice/crio-conmon-25a65cae038fba539bc27589cc7c4f44444fed5df321cb4a5f40a5de73068b1e.scope\": RecentStats: unable to find data in memory cache]" Feb 23 08:45:02 crc kubenswrapper[5047]: I0223 08:45:02.871425 5047 generic.go:334] "Generic (PLEG): container finished" podID="0c60b58d-c0c1-470c-b84d-79528f882fb6" containerID="25a65cae038fba539bc27589cc7c4f44444fed5df321cb4a5f40a5de73068b1e" exitCode=0 Feb 23 08:45:02 crc kubenswrapper[5047]: I0223 08:45:02.871501 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kdn4n" event={"ID":"0c60b58d-c0c1-470c-b84d-79528f882fb6","Type":"ContainerDied","Data":"25a65cae038fba539bc27589cc7c4f44444fed5df321cb4a5f40a5de73068b1e"} Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.313461 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.339357 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-secret-volume\") pod \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.339659 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98m67\" (UniqueName: \"kubernetes.io/projected/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-kube-api-access-98m67\") pod \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.339880 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-config-volume\") pod \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\" (UID: \"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97\") " Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.340732 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a94a7d2-4cfd-4b89-8b1a-41608f29bf97" (UID: "2a94a7d2-4cfd-4b89-8b1a-41608f29bf97"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.342172 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.347546 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-kube-api-access-98m67" (OuterVolumeSpecName: "kube-api-access-98m67") pod "2a94a7d2-4cfd-4b89-8b1a-41608f29bf97" (UID: "2a94a7d2-4cfd-4b89-8b1a-41608f29bf97"). InnerVolumeSpecName "kube-api-access-98m67". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.349077 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a94a7d2-4cfd-4b89-8b1a-41608f29bf97" (UID: "2a94a7d2-4cfd-4b89-8b1a-41608f29bf97"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.444230 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.444506 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98m67\" (UniqueName: \"kubernetes.io/projected/2a94a7d2-4cfd-4b89-8b1a-41608f29bf97-kube-api-access-98m67\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.885258 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" event={"ID":"2a94a7d2-4cfd-4b89-8b1a-41608f29bf97","Type":"ContainerDied","Data":"8feb430d70992db548fc1fc4be94528d33d45198e47da098eb95bffebafeab2c"} Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.885323 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8feb430d70992db548fc1fc4be94528d33d45198e47da098eb95bffebafeab2c" Feb 23 08:45:03 crc kubenswrapper[5047]: I0223 08:45:03.885271 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-4r9dr" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.206712 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.261251 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-combined-ca-bundle\") pod \"0c60b58d-c0c1-470c-b84d-79528f882fb6\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.261471 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-db-sync-config-data\") pod \"0c60b58d-c0c1-470c-b84d-79528f882fb6\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.261511 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jtdm\" (UniqueName: \"kubernetes.io/projected/0c60b58d-c0c1-470c-b84d-79528f882fb6-kube-api-access-7jtdm\") pod \"0c60b58d-c0c1-470c-b84d-79528f882fb6\" (UID: \"0c60b58d-c0c1-470c-b84d-79528f882fb6\") " Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.271942 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c60b58d-c0c1-470c-b84d-79528f882fb6-kube-api-access-7jtdm" (OuterVolumeSpecName: "kube-api-access-7jtdm") pod "0c60b58d-c0c1-470c-b84d-79528f882fb6" (UID: "0c60b58d-c0c1-470c-b84d-79528f882fb6"). InnerVolumeSpecName "kube-api-access-7jtdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.274942 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0c60b58d-c0c1-470c-b84d-79528f882fb6" (UID: "0c60b58d-c0c1-470c-b84d-79528f882fb6"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.313814 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c60b58d-c0c1-470c-b84d-79528f882fb6" (UID: "0c60b58d-c0c1-470c-b84d-79528f882fb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.363502 5047 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.363958 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jtdm\" (UniqueName: \"kubernetes.io/projected/0c60b58d-c0c1-470c-b84d-79528f882fb6-kube-api-access-7jtdm\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.363978 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c60b58d-c0c1-470c-b84d-79528f882fb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.396566 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s"] Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.409983 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-vqz2s"] Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.892541 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kdn4n" 
event={"ID":"0c60b58d-c0c1-470c-b84d-79528f882fb6","Type":"ContainerDied","Data":"84880a24510c66a1949de3f60857a0f014773cdb14aed585ae56ef2d93be2a59"} Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.892578 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84880a24510c66a1949de3f60857a0f014773cdb14aed585ae56ef2d93be2a59" Feb 23 08:45:04 crc kubenswrapper[5047]: I0223 08:45:04.892627 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kdn4n" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.161007 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f75d6c566-lwrqj"] Feb 23 08:45:05 crc kubenswrapper[5047]: E0223 08:45:05.161446 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a94a7d2-4cfd-4b89-8b1a-41608f29bf97" containerName="collect-profiles" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.161469 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a94a7d2-4cfd-4b89-8b1a-41608f29bf97" containerName="collect-profiles" Feb 23 08:45:05 crc kubenswrapper[5047]: E0223 08:45:05.161490 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c60b58d-c0c1-470c-b84d-79528f882fb6" containerName="barbican-db-sync" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.161499 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c60b58d-c0c1-470c-b84d-79528f882fb6" containerName="barbican-db-sync" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.161689 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c60b58d-c0c1-470c-b84d-79528f882fb6" containerName="barbican-db-sync" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.161714 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a94a7d2-4cfd-4b89-8b1a-41608f29bf97" containerName="collect-profiles" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.162781 
5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.171389 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.171431 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.172026 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mr4v8" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.178031 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89hwx\" (UniqueName: \"kubernetes.io/projected/56f69678-8caa-45a4-8361-a0bf3ef10d19-kube-api-access-89hwx\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.178117 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.178193 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data-custom\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 
08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.178223 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-combined-ca-bundle\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.178288 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f69678-8caa-45a4-8361-a0bf3ef10d19-logs\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.200074 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f75d6c566-lwrqj"] Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.227583 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56c47db55f-krsw7"] Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.229738 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.241330 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.270222 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56c47db55f-krsw7"]
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281258 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281338 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data-custom\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281369 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-combined-ca-bundle\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281409 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f69678-8caa-45a4-8361-a0bf3ef10d19-logs\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281444 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data-custom\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281472 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-combined-ca-bundle\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281529 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjbf\" (UniqueName: \"kubernetes.io/projected/5e613490-e007-4f7e-9868-abf59633c7c2-kube-api-access-8kjbf\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281563 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e613490-e007-4f7e-9868-abf59633c7c2-logs\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281586 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.281643 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89hwx\" (UniqueName: \"kubernetes.io/projected/56f69678-8caa-45a4-8361-a0bf3ef10d19-kube-api-access-89hwx\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.289344 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.293712 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data-custom\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.293930 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f69678-8caa-45a4-8361-a0bf3ef10d19-logs\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.294672 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-combined-ca-bundle\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.327111 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89hwx\" (UniqueName: \"kubernetes.io/projected/56f69678-8caa-45a4-8361-a0bf3ef10d19-kube-api-access-89hwx\") pod \"barbican-keystone-listener-5f75d6c566-lwrqj\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.331900 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-549cbc9d6f-h9h4z"]
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.333576 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.343512 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549cbc9d6f-h9h4z"]
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390208 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390366 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6kb2\" (UniqueName: \"kubernetes.io/projected/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-kube-api-access-t6kb2\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390442 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data-custom\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390470 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-combined-ca-bundle\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390518 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-config\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390540 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-dns-svc\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390568 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kjbf\" (UniqueName: \"kubernetes.io/projected/5e613490-e007-4f7e-9868-abf59633c7c2-kube-api-access-8kjbf\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390591 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-sb\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390632 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e613490-e007-4f7e-9868-abf59633c7c2-logs\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.390654 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-nb\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.394460 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e613490-e007-4f7e-9868-abf59633c7c2-logs\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.395344 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-combined-ca-bundle\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.395369 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data-custom\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.396337 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.402236 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c48d97d68-59ffb"]
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.407863 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.411405 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.418933 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjbf\" (UniqueName: \"kubernetes.io/projected/5e613490-e007-4f7e-9868-abf59633c7c2-kube-api-access-8kjbf\") pod \"barbican-worker-56c47db55f-krsw7\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.422106 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c48d97d68-59ffb"]
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492515 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-config\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492585 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-dns-svc\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492619 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-sb\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492655 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-nb\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492758 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data-custom\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492798 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89d2n\" (UniqueName: \"kubernetes.io/projected/5dadff11-3a5a-4935-ad6a-dec911875ebb-kube-api-access-89d2n\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492826 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dadff11-3a5a-4935-ad6a-dec911875ebb-logs\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492852 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6kb2\" (UniqueName: \"kubernetes.io/projected/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-kube-api-access-t6kb2\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492897 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-combined-ca-bundle\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.492996 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.494439 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-config\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.495108 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-dns-svc\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.495347 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-nb\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.495999 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-sb\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.501428 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.515237 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6kb2\" (UniqueName: \"kubernetes.io/projected/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-kube-api-access-t6kb2\") pod \"dnsmasq-dns-549cbc9d6f-h9h4z\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.557597 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56c47db55f-krsw7"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.596634 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data-custom\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.596722 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89d2n\" (UniqueName: \"kubernetes.io/projected/5dadff11-3a5a-4935-ad6a-dec911875ebb-kube-api-access-89d2n\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.596748 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dadff11-3a5a-4935-ad6a-dec911875ebb-logs\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.596794 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-combined-ca-bundle\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.596818 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.599339 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dadff11-3a5a-4935-ad6a-dec911875ebb-logs\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.602667 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.603048 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-combined-ca-bundle\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.603341 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data-custom\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.627238 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89d2n\" (UniqueName: \"kubernetes.io/projected/5dadff11-3a5a-4935-ad6a-dec911875ebb-kube-api-access-89d2n\") pod \"barbican-api-5c48d97d68-59ffb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.782466 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.789881 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:05 crc kubenswrapper[5047]: I0223 08:45:05.954817 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56c47db55f-krsw7"]
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.011786 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f75d6c566-lwrqj"]
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.323118 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549cbc9d6f-h9h4z"]
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.356770 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c334ee-69a0-4a5d-9c5c-7b253af16a18" path="/var/lib/kubelet/pods/64c334ee-69a0-4a5d-9c5c-7b253af16a18/volumes"
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.375678 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c48d97d68-59ffb"]
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.916491 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48d97d68-59ffb" event={"ID":"5dadff11-3a5a-4935-ad6a-dec911875ebb","Type":"ContainerStarted","Data":"697ef8c2de5903124932940f0584217f43c638d5bc856a60937c4ebfde8e4ed8"}
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.919415 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" event={"ID":"56f69678-8caa-45a4-8361-a0bf3ef10d19","Type":"ContainerStarted","Data":"decc8b8f7b2dfc4e7ef4724327db55ae0b38ada9af823a26288f00524f859bf5"}
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.921680 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56c47db55f-krsw7" event={"ID":"5e613490-e007-4f7e-9868-abf59633c7c2","Type":"ContainerStarted","Data":"6d94b313ca3f23a72550cd486abde3b7c8eb06b9b4fb20291a2cf0afb5b37850"}
Feb 23 08:45:06 crc kubenswrapper[5047]: I0223 08:45:06.922977 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" event={"ID":"ce2a4182-5761-4a21-82b4-02f46e3c7b5c","Type":"ContainerStarted","Data":"1cf1c1a43bb757cd5529002ab9c90a40342ac2ab5f3a75d3f602b264a8c42155"}
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.353656 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d854f95bd-qf2l8"]
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.360580 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.363894 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.364217 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.369733 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d854f95bd-qf2l8"]
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.441081 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2925\" (UniqueName: \"kubernetes.io/projected/1058e169-d572-43c5-80b3-d5f2f2c78afb-kube-api-access-q2925\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.441269 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1058e169-d572-43c5-80b3-d5f2f2c78afb-logs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.441398 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-internal-tls-certs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.441447 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-public-tls-certs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.441525 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.441665 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data-custom\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.441725 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-combined-ca-bundle\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.543414 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-combined-ca-bundle\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.543504 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2925\" (UniqueName: \"kubernetes.io/projected/1058e169-d572-43c5-80b3-d5f2f2c78afb-kube-api-access-q2925\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.543574 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1058e169-d572-43c5-80b3-d5f2f2c78afb-logs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.543625 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-internal-tls-certs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.543652 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-public-tls-certs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.543680 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.543729 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data-custom\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.544719 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1058e169-d572-43c5-80b3-d5f2f2c78afb-logs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.550232 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data-custom\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.553886 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-internal-tls-certs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.554963 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-combined-ca-bundle\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.558651 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-public-tls-certs\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.562078 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2925\" (UniqueName: \"kubernetes.io/projected/1058e169-d572-43c5-80b3-d5f2f2c78afb-kube-api-access-q2925\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.570228 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data\") pod \"barbican-api-5d854f95bd-qf2l8\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.688726 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.933992 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48d97d68-59ffb" event={"ID":"5dadff11-3a5a-4935-ad6a-dec911875ebb","Type":"ContainerStarted","Data":"eff9948e009c554275cf5226879a4408cd0514825f6f1167501a58fe206dfc21"}
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.935935 5047 generic.go:334] "Generic (PLEG): container finished" podID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerID="4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77" exitCode=0
Feb 23 08:45:07 crc kubenswrapper[5047]: I0223 08:45:07.935994 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" event={"ID":"ce2a4182-5761-4a21-82b4-02f46e3c7b5c","Type":"ContainerDied","Data":"4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.324157 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d854f95bd-qf2l8"]
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.947952 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" event={"ID":"ce2a4182-5761-4a21-82b4-02f46e3c7b5c","Type":"ContainerStarted","Data":"cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.949217 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z"
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.951294 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" event={"ID":"56f69678-8caa-45a4-8361-a0bf3ef10d19","Type":"ContainerStarted","Data":"46026da011a0aa7880379234ae4494704c649f1e05648955f118d375e2a3e1e0"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.951343 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" event={"ID":"56f69678-8caa-45a4-8361-a0bf3ef10d19","Type":"ContainerStarted","Data":"f80a58364511a0fa76558ab45536d04c4cce2e97d79499c3e5dad8ce40b0217a"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.954306 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48d97d68-59ffb" event={"ID":"5dadff11-3a5a-4935-ad6a-dec911875ebb","Type":"ContainerStarted","Data":"9ad46517bc4089f11412129ad3029ac9142807c4030217a51f92372d30bb6be2"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.954501 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c48d97d68-59ffb"
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.956731 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d854f95bd-qf2l8" event={"ID":"1058e169-d572-43c5-80b3-d5f2f2c78afb","Type":"ContainerStarted","Data":"d60474bd97e9b0d0915a58eae3db7da57f5234e0b02651a966146b0399694399"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.956758 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d854f95bd-qf2l8" event={"ID":"1058e169-d572-43c5-80b3-d5f2f2c78afb","Type":"ContainerStarted","Data":"dcf3076fcf8e8a3596430ddf627b1510267721dfc147bcaa96b9951e0ece0e7c"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.956789 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d854f95bd-qf2l8" event={"ID":"1058e169-d572-43c5-80b3-d5f2f2c78afb","Type":"ContainerStarted","Data":"ded9d9ca87a57a22eb2252345cbf14e358b71875343afbe48bf8dba37631cd84"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.963962 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56c47db55f-krsw7" event={"ID":"5e613490-e007-4f7e-9868-abf59633c7c2","Type":"ContainerStarted","Data":"4c4abeb50762236bdb25af75e59326ffcbc0cafbe34c51b67459b228cce0deb5"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.964006 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56c47db55f-krsw7" event={"ID":"5e613490-e007-4f7e-9868-abf59633c7c2","Type":"ContainerStarted","Data":"2b967aa0fbb7606bb8ade55e0690f140e4568286a58a2b7dd614a5b813dee08a"}
Feb 23 08:45:08 crc kubenswrapper[5047]: I0223 08:45:08.987236 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" podStartSLOduration=3.987206871 podStartE2EDuration="3.987206871s" podCreationTimestamp="2026-02-23 08:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:08.97668707 +0000 UTC m=+7231.228014194" watchObservedRunningTime="2026-02-23 08:45:08.987206871 +0000 UTC m=+7231.238534005"
Feb 23 08:45:09 crc kubenswrapper[5047]: I0223 08:45:09.004561 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" podStartSLOduration=2.213007587 podStartE2EDuration="4.004541798s" podCreationTimestamp="2026-02-23 08:45:05 +0000 UTC" firstStartedPulling="2026-02-23 08:45:06.030144903 +0000 UTC m=+7228.281472037" lastFinishedPulling="2026-02-23 08:45:07.821679084 +0000 UTC m=+7230.073006248" observedRunningTime="2026-02-23 08:45:09.000632232 +0000 UTC m=+7231.251959366" watchObservedRunningTime="2026-02-23 08:45:09.004541798 +0000 UTC m=+7231.255868932"
Feb 23 08:45:09 crc kubenswrapper[5047]: I0223 08:45:09.065807 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d854f95bd-qf2l8" podStartSLOduration=2.065786041 podStartE2EDuration="2.065786041s" podCreationTimestamp="2026-02-23 08:45:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:09.058844265 +0000 UTC m=+7231.310171399" watchObservedRunningTime="2026-02-23 08:45:09.065786041 +0000 UTC m=+7231.317113175" Feb 23 08:45:09 crc kubenswrapper[5047]: I0223 08:45:09.067288 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c48d97d68-59ffb" podStartSLOduration=4.067281731 podStartE2EDuration="4.067281731s" podCreationTimestamp="2026-02-23 08:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:09.032433845 +0000 UTC m=+7231.283760999" watchObservedRunningTime="2026-02-23 08:45:09.067281731 +0000 UTC m=+7231.318608865" Feb 23 08:45:09 crc kubenswrapper[5047]: I0223 08:45:09.088124 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56c47db55f-krsw7" podStartSLOduration=2.255660691 podStartE2EDuration="4.08809643s" podCreationTimestamp="2026-02-23 08:45:05 +0000 UTC" firstStartedPulling="2026-02-23 08:45:05.986668046 +0000 UTC m=+7228.237995180" lastFinishedPulling="2026-02-23 08:45:07.819103785 +0000 UTC m=+7230.070430919" observedRunningTime="2026-02-23 08:45:09.084215636 +0000 UTC m=+7231.335542770" watchObservedRunningTime="2026-02-23 08:45:09.08809643 +0000 UTC m=+7231.339423564" Feb 23 08:45:09 crc kubenswrapper[5047]: I0223 08:45:09.974074 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d854f95bd-qf2l8" Feb 23 08:45:09 crc kubenswrapper[5047]: I0223 08:45:09.974366 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c48d97d68-59ffb" Feb 23 08:45:09 crc kubenswrapper[5047]: I0223 08:45:09.974378 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-5d854f95bd-qf2l8" Feb 23 08:45:15 crc kubenswrapper[5047]: I0223 08:45:15.784107 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" Feb 23 08:45:15 crc kubenswrapper[5047]: I0223 08:45:15.869661 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb8ffc8f7-vngbx"] Feb 23 08:45:15 crc kubenswrapper[5047]: I0223 08:45:15.870058 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" containerName="dnsmasq-dns" containerID="cri-o://ab938f6b20aa6b8432c90c929ffb9c529dc3c97a1dc55488a857777ae5c5002e" gracePeriod=10 Feb 23 08:45:15 crc kubenswrapper[5047]: I0223 08:45:15.897492 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.34:5353: connect: connection refused" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.042502 5047 generic.go:334] "Generic (PLEG): container finished" podID="ea0fb049-f486-4799-8cc4-31c66c897340" containerID="ab938f6b20aa6b8432c90c929ffb9c529dc3c97a1dc55488a857777ae5c5002e" exitCode=0 Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.042871 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" event={"ID":"ea0fb049-f486-4799-8cc4-31c66c897340","Type":"ContainerDied","Data":"ab938f6b20aa6b8432c90c929ffb9c529dc3c97a1dc55488a857777ae5c5002e"} Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.356398 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.455809 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9crr9\" (UniqueName: \"kubernetes.io/projected/ea0fb049-f486-4799-8cc4-31c66c897340-kube-api-access-9crr9\") pod \"ea0fb049-f486-4799-8cc4-31c66c897340\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.455892 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-config\") pod \"ea0fb049-f486-4799-8cc4-31c66c897340\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.456014 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-nb\") pod \"ea0fb049-f486-4799-8cc4-31c66c897340\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.456104 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-dns-svc\") pod \"ea0fb049-f486-4799-8cc4-31c66c897340\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.456128 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-sb\") pod \"ea0fb049-f486-4799-8cc4-31c66c897340\" (UID: \"ea0fb049-f486-4799-8cc4-31c66c897340\") " Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.478267 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ea0fb049-f486-4799-8cc4-31c66c897340-kube-api-access-9crr9" (OuterVolumeSpecName: "kube-api-access-9crr9") pod "ea0fb049-f486-4799-8cc4-31c66c897340" (UID: "ea0fb049-f486-4799-8cc4-31c66c897340"). InnerVolumeSpecName "kube-api-access-9crr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.528081 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea0fb049-f486-4799-8cc4-31c66c897340" (UID: "ea0fb049-f486-4799-8cc4-31c66c897340"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.531508 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea0fb049-f486-4799-8cc4-31c66c897340" (UID: "ea0fb049-f486-4799-8cc4-31c66c897340"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.540671 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea0fb049-f486-4799-8cc4-31c66c897340" (UID: "ea0fb049-f486-4799-8cc4-31c66c897340"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.541433 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-config" (OuterVolumeSpecName: "config") pod "ea0fb049-f486-4799-8cc4-31c66c897340" (UID: "ea0fb049-f486-4799-8cc4-31c66c897340"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.559280 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9crr9\" (UniqueName: \"kubernetes.io/projected/ea0fb049-f486-4799-8cc4-31c66c897340-kube-api-access-9crr9\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.559623 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.559688 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.559748 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.559802 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea0fb049-f486-4799-8cc4-31c66c897340-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.760281 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:45:16 crc kubenswrapper[5047]: I0223 08:45:16.761311 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.075245 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" event={"ID":"ea0fb049-f486-4799-8cc4-31c66c897340","Type":"ContainerDied","Data":"1e0b35e267dd0cf0de662bb8dbf36b4dc5bb65d9763cf0979b958577af88e7ea"} Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.075392 5047 scope.go:117] "RemoveContainer" containerID="ab938f6b20aa6b8432c90c929ffb9c529dc3c97a1dc55488a857777ae5c5002e" Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.075402 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb8ffc8f7-vngbx" Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.104580 5047 scope.go:117] "RemoveContainer" containerID="1dac1d9d7ceac931c476e1fa89a68153eaabc3efa8351493368913e15b132aa7" Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.123016 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb8ffc8f7-vngbx"] Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.133549 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb8ffc8f7-vngbx"] Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.597723 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c48d97d68-59ffb" Feb 23 08:45:17 crc kubenswrapper[5047]: I0223 08:45:17.826938 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c48d97d68-59ffb" Feb 23 08:45:18 crc kubenswrapper[5047]: I0223 08:45:18.351856 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" path="/var/lib/kubelet/pods/ea0fb049-f486-4799-8cc4-31c66c897340/volumes" Feb 23 08:45:19 crc 
kubenswrapper[5047]: I0223 08:45:19.332660 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d854f95bd-qf2l8" Feb 23 08:45:19 crc kubenswrapper[5047]: I0223 08:45:19.344926 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d854f95bd-qf2l8" Feb 23 08:45:19 crc kubenswrapper[5047]: I0223 08:45:19.414201 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c48d97d68-59ffb"] Feb 23 08:45:19 crc kubenswrapper[5047]: I0223 08:45:19.414440 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c48d97d68-59ffb" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api-log" containerID="cri-o://eff9948e009c554275cf5226879a4408cd0514825f6f1167501a58fe206dfc21" gracePeriod=30 Feb 23 08:45:19 crc kubenswrapper[5047]: I0223 08:45:19.414522 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c48d97d68-59ffb" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api" containerID="cri-o://9ad46517bc4089f11412129ad3029ac9142807c4030217a51f92372d30bb6be2" gracePeriod=30 Feb 23 08:45:19 crc kubenswrapper[5047]: I0223 08:45:19.435468 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48d97d68-59ffb" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.47:9311/healthcheck\": EOF" Feb 23 08:45:20 crc kubenswrapper[5047]: I0223 08:45:20.121388 5047 generic.go:334] "Generic (PLEG): container finished" podID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerID="eff9948e009c554275cf5226879a4408cd0514825f6f1167501a58fe206dfc21" exitCode=143 Feb 23 08:45:20 crc kubenswrapper[5047]: I0223 08:45:20.121538 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48d97d68-59ffb" 
event={"ID":"5dadff11-3a5a-4935-ad6a-dec911875ebb","Type":"ContainerDied","Data":"eff9948e009c554275cf5226879a4408cd0514825f6f1167501a58fe206dfc21"} Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.808367 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xmw4b"] Feb 23 08:45:21 crc kubenswrapper[5047]: E0223 08:45:21.809694 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" containerName="dnsmasq-dns" Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.809730 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" containerName="dnsmasq-dns" Feb 23 08:45:21 crc kubenswrapper[5047]: E0223 08:45:21.809793 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" containerName="init" Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.809811 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" containerName="init" Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.810487 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0fb049-f486-4799-8cc4-31c66c897340" containerName="dnsmasq-dns" Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.813728 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.833723 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmw4b"] Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.910537 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvzz\" (UniqueName: \"kubernetes.io/projected/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-kube-api-access-lxvzz\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.911216 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-utilities\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:21 crc kubenswrapper[5047]: I0223 08:45:21.911299 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-catalog-content\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.012750 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-utilities\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.012799 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-catalog-content\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.012864 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvzz\" (UniqueName: \"kubernetes.io/projected/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-kube-api-access-lxvzz\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.013778 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-utilities\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.013876 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-catalog-content\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.047288 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvzz\" (UniqueName: \"kubernetes.io/projected/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-kube-api-access-lxvzz\") pod \"community-operators-xmw4b\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.133644 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.696359 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmw4b"] Feb 23 08:45:22 crc kubenswrapper[5047]: W0223 08:45:22.709189 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ed4d35_2262_4f98_ad36_5c3c36b97a3e.slice/crio-dbae555aaab3bc2bd2db527ffb593bbaea33ba87887fce63d6eebdd07beb1472 WatchSource:0}: Error finding container dbae555aaab3bc2bd2db527ffb593bbaea33ba87887fce63d6eebdd07beb1472: Status 404 returned error can't find the container with id dbae555aaab3bc2bd2db527ffb593bbaea33ba87887fce63d6eebdd07beb1472 Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.835725 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48d97d68-59ffb" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.47:9311/healthcheck\": read tcp 10.217.0.2:35600->10.217.1.47:9311: read: connection reset by peer" Feb 23 08:45:22 crc kubenswrapper[5047]: I0223 08:45:22.835782 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48d97d68-59ffb" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.47:9311/healthcheck\": read tcp 10.217.0.2:35612->10.217.1.47:9311: read: connection reset by peer" Feb 23 08:45:23 crc kubenswrapper[5047]: E0223 08:45:23.099640 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ed4d35_2262_4f98_ad36_5c3c36b97a3e.slice/crio-conmon-a981dabcffe67846888ca6f5f16cb9857a76d1c5757cda7a82d7452744a10c9a.scope\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ed4d35_2262_4f98_ad36_5c3c36b97a3e.slice/crio-a981dabcffe67846888ca6f5f16cb9857a76d1c5757cda7a82d7452744a10c9a.scope\": RecentStats: unable to find data in memory cache]" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.164795 5047 generic.go:334] "Generic (PLEG): container finished" podID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerID="a981dabcffe67846888ca6f5f16cb9857a76d1c5757cda7a82d7452744a10c9a" exitCode=0 Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.164926 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw4b" event={"ID":"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e","Type":"ContainerDied","Data":"a981dabcffe67846888ca6f5f16cb9857a76d1c5757cda7a82d7452744a10c9a"} Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.164964 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw4b" event={"ID":"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e","Type":"ContainerStarted","Data":"dbae555aaab3bc2bd2db527ffb593bbaea33ba87887fce63d6eebdd07beb1472"} Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.171091 5047 generic.go:334] "Generic (PLEG): container finished" podID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerID="9ad46517bc4089f11412129ad3029ac9142807c4030217a51f92372d30bb6be2" exitCode=0 Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.171162 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48d97d68-59ffb" event={"ID":"5dadff11-3a5a-4935-ad6a-dec911875ebb","Type":"ContainerDied","Data":"9ad46517bc4089f11412129ad3029ac9142807c4030217a51f92372d30bb6be2"} Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.334817 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c48d97d68-59ffb" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.464035 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data-custom\") pod \"5dadff11-3a5a-4935-ad6a-dec911875ebb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.464093 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89d2n\" (UniqueName: \"kubernetes.io/projected/5dadff11-3a5a-4935-ad6a-dec911875ebb-kube-api-access-89d2n\") pod \"5dadff11-3a5a-4935-ad6a-dec911875ebb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.464157 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-combined-ca-bundle\") pod \"5dadff11-3a5a-4935-ad6a-dec911875ebb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.464246 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dadff11-3a5a-4935-ad6a-dec911875ebb-logs\") pod \"5dadff11-3a5a-4935-ad6a-dec911875ebb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.464290 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data\") pod \"5dadff11-3a5a-4935-ad6a-dec911875ebb\" (UID: \"5dadff11-3a5a-4935-ad6a-dec911875ebb\") " Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.465239 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5dadff11-3a5a-4935-ad6a-dec911875ebb-logs" (OuterVolumeSpecName: "logs") pod "5dadff11-3a5a-4935-ad6a-dec911875ebb" (UID: "5dadff11-3a5a-4935-ad6a-dec911875ebb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.475216 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dadff11-3a5a-4935-ad6a-dec911875ebb-kube-api-access-89d2n" (OuterVolumeSpecName: "kube-api-access-89d2n") pod "5dadff11-3a5a-4935-ad6a-dec911875ebb" (UID: "5dadff11-3a5a-4935-ad6a-dec911875ebb"). InnerVolumeSpecName "kube-api-access-89d2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.485893 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5dadff11-3a5a-4935-ad6a-dec911875ebb" (UID: "5dadff11-3a5a-4935-ad6a-dec911875ebb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.497457 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dadff11-3a5a-4935-ad6a-dec911875ebb" (UID: "5dadff11-3a5a-4935-ad6a-dec911875ebb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.532912 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data" (OuterVolumeSpecName: "config-data") pod "5dadff11-3a5a-4935-ad6a-dec911875ebb" (UID: "5dadff11-3a5a-4935-ad6a-dec911875ebb"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.570572 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dadff11-3a5a-4935-ad6a-dec911875ebb-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.570624 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.570637 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.570655 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89d2n\" (UniqueName: \"kubernetes.io/projected/5dadff11-3a5a-4935-ad6a-dec911875ebb-kube-api-access-89d2n\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:23 crc kubenswrapper[5047]: I0223 08:45:23.570667 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dadff11-3a5a-4935-ad6a-dec911875ebb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.184144 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw4b" event={"ID":"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e","Type":"ContainerStarted","Data":"1e1c7042b0d09b83ed7ebacd98d516960fec21979072b96038406cc60415dac1"} Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.187294 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48d97d68-59ffb" 
event={"ID":"5dadff11-3a5a-4935-ad6a-dec911875ebb","Type":"ContainerDied","Data":"697ef8c2de5903124932940f0584217f43c638d5bc856a60937c4ebfde8e4ed8"} Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.187349 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c48d97d68-59ffb" Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.187379 5047 scope.go:117] "RemoveContainer" containerID="9ad46517bc4089f11412129ad3029ac9142807c4030217a51f92372d30bb6be2" Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.223664 5047 scope.go:117] "RemoveContainer" containerID="eff9948e009c554275cf5226879a4408cd0514825f6f1167501a58fe206dfc21" Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.262242 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c48d97d68-59ffb"] Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.265966 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c48d97d68-59ffb"] Feb 23 08:45:24 crc kubenswrapper[5047]: I0223 08:45:24.356238 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" path="/var/lib/kubelet/pods/5dadff11-3a5a-4935-ad6a-dec911875ebb/volumes" Feb 23 08:45:25 crc kubenswrapper[5047]: I0223 08:45:25.218120 5047 generic.go:334] "Generic (PLEG): container finished" podID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerID="1e1c7042b0d09b83ed7ebacd98d516960fec21979072b96038406cc60415dac1" exitCode=0 Feb 23 08:45:25 crc kubenswrapper[5047]: I0223 08:45:25.218246 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw4b" event={"ID":"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e","Type":"ContainerDied","Data":"1e1c7042b0d09b83ed7ebacd98d516960fec21979072b96038406cc60415dac1"} Feb 23 08:45:26 crc kubenswrapper[5047]: I0223 08:45:26.248348 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-xmw4b" event={"ID":"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e","Type":"ContainerStarted","Data":"9d1bf75208e8fb1456ea1b2b64c6f201bcc3869e45634522b9eaefda6c28ed29"} Feb 23 08:45:26 crc kubenswrapper[5047]: I0223 08:45:26.272379 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xmw4b" podStartSLOduration=2.597370883 podStartE2EDuration="5.27235025s" podCreationTimestamp="2026-02-23 08:45:21 +0000 UTC" firstStartedPulling="2026-02-23 08:45:23.168951823 +0000 UTC m=+7245.420278957" lastFinishedPulling="2026-02-23 08:45:25.84393118 +0000 UTC m=+7248.095258324" observedRunningTime="2026-02-23 08:45:26.270444959 +0000 UTC m=+7248.521772103" watchObservedRunningTime="2026-02-23 08:45:26.27235025 +0000 UTC m=+7248.523677404" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.059060 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-246gf"] Feb 23 08:45:27 crc kubenswrapper[5047]: E0223 08:45:27.060235 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.060310 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api" Feb 23 08:45:27 crc kubenswrapper[5047]: E0223 08:45:27.060367 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api-log" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.060416 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api-log" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.060636 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api-log" Feb 23 
08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.060710 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dadff11-3a5a-4935-ad6a-dec911875ebb" containerName="barbican-api" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.061363 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-246gf" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.077617 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-246gf"] Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.168620 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-317c-account-create-update-88hlp"] Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.170235 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.175625 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.183786 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-317c-account-create-update-88hlp"] Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.260286 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f0974c-f916-44fb-92ff-4e3c7dafadd1-operator-scripts\") pod \"neutron-db-create-246gf\" (UID: \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " pod="openstack/neutron-db-create-246gf" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.260565 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwb68\" (UniqueName: \"kubernetes.io/projected/24f0974c-f916-44fb-92ff-4e3c7dafadd1-kube-api-access-wwb68\") pod \"neutron-db-create-246gf\" (UID: 
\"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " pod="openstack/neutron-db-create-246gf" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.362142 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdn8w\" (UniqueName: \"kubernetes.io/projected/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-kube-api-access-gdn8w\") pod \"neutron-317c-account-create-update-88hlp\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.362350 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwb68\" (UniqueName: \"kubernetes.io/projected/24f0974c-f916-44fb-92ff-4e3c7dafadd1-kube-api-access-wwb68\") pod \"neutron-db-create-246gf\" (UID: \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " pod="openstack/neutron-db-create-246gf" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.362699 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-operator-scripts\") pod \"neutron-317c-account-create-update-88hlp\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.362810 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f0974c-f916-44fb-92ff-4e3c7dafadd1-operator-scripts\") pod \"neutron-db-create-246gf\" (UID: \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " pod="openstack/neutron-db-create-246gf" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.363817 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f0974c-f916-44fb-92ff-4e3c7dafadd1-operator-scripts\") pod 
\"neutron-db-create-246gf\" (UID: \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " pod="openstack/neutron-db-create-246gf" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.387882 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwb68\" (UniqueName: \"kubernetes.io/projected/24f0974c-f916-44fb-92ff-4e3c7dafadd1-kube-api-access-wwb68\") pod \"neutron-db-create-246gf\" (UID: \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " pod="openstack/neutron-db-create-246gf" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.464689 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdn8w\" (UniqueName: \"kubernetes.io/projected/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-kube-api-access-gdn8w\") pod \"neutron-317c-account-create-update-88hlp\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.464833 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-operator-scripts\") pod \"neutron-317c-account-create-update-88hlp\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.465874 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-operator-scripts\") pod \"neutron-317c-account-create-update-88hlp\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.483472 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdn8w\" (UniqueName: 
\"kubernetes.io/projected/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-kube-api-access-gdn8w\") pod \"neutron-317c-account-create-update-88hlp\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.491542 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:27 crc kubenswrapper[5047]: I0223 08:45:27.679861 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-246gf" Feb 23 08:45:28 crc kubenswrapper[5047]: I0223 08:45:28.016366 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-317c-account-create-update-88hlp"] Feb 23 08:45:28 crc kubenswrapper[5047]: I0223 08:45:28.161777 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-246gf"] Feb 23 08:45:28 crc kubenswrapper[5047]: I0223 08:45:28.309983 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-246gf" event={"ID":"24f0974c-f916-44fb-92ff-4e3c7dafadd1","Type":"ContainerStarted","Data":"aa2ddbcd20394cb8534c4f03c160c7f3f4f86a18e5b3b191aa52bcaeca1cdb2c"} Feb 23 08:45:28 crc kubenswrapper[5047]: I0223 08:45:28.328050 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-317c-account-create-update-88hlp" event={"ID":"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb","Type":"ContainerStarted","Data":"e7f35f7cdd514ebfa9ff8bac6c35833daf8e0cbdaf1da0233acf88329b6c931d"} Feb 23 08:45:28 crc kubenswrapper[5047]: I0223 08:45:28.328113 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-317c-account-create-update-88hlp" event={"ID":"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb","Type":"ContainerStarted","Data":"4d24477ee31833e93db2a1de67e0fcb025229cfde2ad85d17f6b58f130d47dab"} Feb 23 08:45:28 crc kubenswrapper[5047]: I0223 08:45:28.397822 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-317c-account-create-update-88hlp" podStartSLOduration=1.397794474 podStartE2EDuration="1.397794474s" podCreationTimestamp="2026-02-23 08:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:28.364592923 +0000 UTC m=+7250.615920057" watchObservedRunningTime="2026-02-23 08:45:28.397794474 +0000 UTC m=+7250.649121608" Feb 23 08:45:29 crc kubenswrapper[5047]: I0223 08:45:29.340902 5047 generic.go:334] "Generic (PLEG): container finished" podID="e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb" containerID="e7f35f7cdd514ebfa9ff8bac6c35833daf8e0cbdaf1da0233acf88329b6c931d" exitCode=0 Feb 23 08:45:29 crc kubenswrapper[5047]: I0223 08:45:29.341077 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-317c-account-create-update-88hlp" event={"ID":"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb","Type":"ContainerDied","Data":"e7f35f7cdd514ebfa9ff8bac6c35833daf8e0cbdaf1da0233acf88329b6c931d"} Feb 23 08:45:29 crc kubenswrapper[5047]: I0223 08:45:29.345381 5047 generic.go:334] "Generic (PLEG): container finished" podID="24f0974c-f916-44fb-92ff-4e3c7dafadd1" containerID="aa4b9f2c9aa9e2fe310ec55c2272fdcdb5d561e7689f4ce33ddea1abbb0a12c3" exitCode=0 Feb 23 08:45:29 crc kubenswrapper[5047]: I0223 08:45:29.345457 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-246gf" event={"ID":"24f0974c-f916-44fb-92ff-4e3c7dafadd1","Type":"ContainerDied","Data":"aa4b9f2c9aa9e2fe310ec55c2272fdcdb5d561e7689f4ce33ddea1abbb0a12c3"} Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.851330 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-246gf" Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.857787 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.963845 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-operator-scripts\") pod \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.963938 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwb68\" (UniqueName: \"kubernetes.io/projected/24f0974c-f916-44fb-92ff-4e3c7dafadd1-kube-api-access-wwb68\") pod \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\" (UID: \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.964018 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdn8w\" (UniqueName: \"kubernetes.io/projected/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-kube-api-access-gdn8w\") pod \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\" (UID: \"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb\") " Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.964040 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f0974c-f916-44fb-92ff-4e3c7dafadd1-operator-scripts\") pod \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\" (UID: \"24f0974c-f916-44fb-92ff-4e3c7dafadd1\") " Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.965015 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb" (UID: "e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.965194 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f0974c-f916-44fb-92ff-4e3c7dafadd1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24f0974c-f916-44fb-92ff-4e3c7dafadd1" (UID: "24f0974c-f916-44fb-92ff-4e3c7dafadd1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.970159 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f0974c-f916-44fb-92ff-4e3c7dafadd1-kube-api-access-wwb68" (OuterVolumeSpecName: "kube-api-access-wwb68") pod "24f0974c-f916-44fb-92ff-4e3c7dafadd1" (UID: "24f0974c-f916-44fb-92ff-4e3c7dafadd1"). InnerVolumeSpecName "kube-api-access-wwb68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:30 crc kubenswrapper[5047]: I0223 08:45:30.971583 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-kube-api-access-gdn8w" (OuterVolumeSpecName: "kube-api-access-gdn8w") pod "e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb" (UID: "e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb"). InnerVolumeSpecName "kube-api-access-gdn8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.067435 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.067490 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwb68\" (UniqueName: \"kubernetes.io/projected/24f0974c-f916-44fb-92ff-4e3c7dafadd1-kube-api-access-wwb68\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.067510 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdn8w\" (UniqueName: \"kubernetes.io/projected/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb-kube-api-access-gdn8w\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.067528 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f0974c-f916-44fb-92ff-4e3c7dafadd1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.376251 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-317c-account-create-update-88hlp" event={"ID":"e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb","Type":"ContainerDied","Data":"4d24477ee31833e93db2a1de67e0fcb025229cfde2ad85d17f6b58f130d47dab"} Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.376892 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d24477ee31833e93db2a1de67e0fcb025229cfde2ad85d17f6b58f130d47dab" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.377010 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-317c-account-create-update-88hlp" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.379411 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-246gf" event={"ID":"24f0974c-f916-44fb-92ff-4e3c7dafadd1","Type":"ContainerDied","Data":"aa2ddbcd20394cb8534c4f03c160c7f3f4f86a18e5b3b191aa52bcaeca1cdb2c"} Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.379455 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa2ddbcd20394cb8534c4f03c160c7f3f4f86a18e5b3b191aa52bcaeca1cdb2c" Feb 23 08:45:31 crc kubenswrapper[5047]: I0223 08:45:31.379541 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-246gf" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.134120 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.134205 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.227974 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.376383 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-w76mz"] Feb 23 08:45:32 crc kubenswrapper[5047]: E0223 08:45:32.376948 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb" containerName="mariadb-account-create-update" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.376976 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb" containerName="mariadb-account-create-update" Feb 23 08:45:32 crc kubenswrapper[5047]: E0223 
08:45:32.376997 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f0974c-f916-44fb-92ff-4e3c7dafadd1" containerName="mariadb-database-create" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.377012 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f0974c-f916-44fb-92ff-4e3c7dafadd1" containerName="mariadb-database-create" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.377350 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb" containerName="mariadb-account-create-update" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.377380 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f0974c-f916-44fb-92ff-4e3c7dafadd1" containerName="mariadb-database-create" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.378410 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.381274 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bwb8h" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.382971 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.385555 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.445056 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w76mz"] Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.469820 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.498995 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-combined-ca-bundle\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.499067 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-config\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.499215 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr77c\" (UniqueName: \"kubernetes.io/projected/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-kube-api-access-cr77c\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.523237 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmw4b"] Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.601897 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-combined-ca-bundle\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.602353 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-config\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.602498 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr77c\" (UniqueName: \"kubernetes.io/projected/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-kube-api-access-cr77c\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.614839 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-config\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.628850 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-combined-ca-bundle\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.654644 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr77c\" (UniqueName: \"kubernetes.io/projected/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-kube-api-access-cr77c\") pod \"neutron-db-sync-w76mz\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:32 crc kubenswrapper[5047]: I0223 08:45:32.709760 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:33 crc kubenswrapper[5047]: I0223 08:45:33.264127 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w76mz"] Feb 23 08:45:33 crc kubenswrapper[5047]: I0223 08:45:33.412696 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w76mz" event={"ID":"d77e4cfd-baf4-4d7f-bc0e-a28608a27638","Type":"ContainerStarted","Data":"a775166635dbb1def2517223661ac61ab9768b0320eaa2bfa86a8a577b8c54a9"} Feb 23 08:45:34 crc kubenswrapper[5047]: I0223 08:45:34.424689 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w76mz" event={"ID":"d77e4cfd-baf4-4d7f-bc0e-a28608a27638","Type":"ContainerStarted","Data":"1221a96ef2e1d143401f421d9dbd9acc36d207a6a2d1b24a62529ccca05df63f"} Feb 23 08:45:34 crc kubenswrapper[5047]: I0223 08:45:34.425008 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xmw4b" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="registry-server" containerID="cri-o://9d1bf75208e8fb1456ea1b2b64c6f201bcc3869e45634522b9eaefda6c28ed29" gracePeriod=2 Feb 23 08:45:34 crc kubenswrapper[5047]: I0223 08:45:34.459923 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-w76mz" podStartSLOduration=2.459872213 podStartE2EDuration="2.459872213s" podCreationTimestamp="2026-02-23 08:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:34.445950999 +0000 UTC m=+7256.697278173" watchObservedRunningTime="2026-02-23 08:45:34.459872213 +0000 UTC m=+7256.711199367" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.439867 5047 generic.go:334] "Generic (PLEG): container finished" podID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" 
containerID="9d1bf75208e8fb1456ea1b2b64c6f201bcc3869e45634522b9eaefda6c28ed29" exitCode=0 Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.440028 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw4b" event={"ID":"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e","Type":"ContainerDied","Data":"9d1bf75208e8fb1456ea1b2b64c6f201bcc3869e45634522b9eaefda6c28ed29"} Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.440343 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmw4b" event={"ID":"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e","Type":"ContainerDied","Data":"dbae555aaab3bc2bd2db527ffb593bbaea33ba87887fce63d6eebdd07beb1472"} Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.440372 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbae555aaab3bc2bd2db527ffb593bbaea33ba87887fce63d6eebdd07beb1472" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.453899 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.576308 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-utilities\") pod \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.576426 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvzz\" (UniqueName: \"kubernetes.io/projected/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-kube-api-access-lxvzz\") pod \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.576469 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-catalog-content\") pod \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\" (UID: \"a6ed4d35-2262-4f98-ad36-5c3c36b97a3e\") " Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.577985 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-utilities" (OuterVolumeSpecName: "utilities") pod "a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" (UID: "a6ed4d35-2262-4f98-ad36-5c3c36b97a3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.585486 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-kube-api-access-lxvzz" (OuterVolumeSpecName: "kube-api-access-lxvzz") pod "a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" (UID: "a6ed4d35-2262-4f98-ad36-5c3c36b97a3e"). InnerVolumeSpecName "kube-api-access-lxvzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.650109 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" (UID: "a6ed4d35-2262-4f98-ad36-5c3c36b97a3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.679471 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.679536 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:35 crc kubenswrapper[5047]: I0223 08:45:35.679605 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxvzz\" (UniqueName: \"kubernetes.io/projected/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e-kube-api-access-lxvzz\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:36 crc kubenswrapper[5047]: I0223 08:45:36.447372 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmw4b" Feb 23 08:45:36 crc kubenswrapper[5047]: I0223 08:45:36.474184 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmw4b"] Feb 23 08:45:36 crc kubenswrapper[5047]: I0223 08:45:36.480225 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xmw4b"] Feb 23 08:45:38 crc kubenswrapper[5047]: I0223 08:45:38.352987 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" path="/var/lib/kubelet/pods/a6ed4d35-2262-4f98-ad36-5c3c36b97a3e/volumes" Feb 23 08:45:38 crc kubenswrapper[5047]: I0223 08:45:38.479278 5047 generic.go:334] "Generic (PLEG): container finished" podID="d77e4cfd-baf4-4d7f-bc0e-a28608a27638" containerID="1221a96ef2e1d143401f421d9dbd9acc36d207a6a2d1b24a62529ccca05df63f" exitCode=0 Feb 23 08:45:38 crc kubenswrapper[5047]: I0223 08:45:38.479366 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w76mz" event={"ID":"d77e4cfd-baf4-4d7f-bc0e-a28608a27638","Type":"ContainerDied","Data":"1221a96ef2e1d143401f421d9dbd9acc36d207a6a2d1b24a62529ccca05df63f"} Feb 23 08:45:39 crc kubenswrapper[5047]: I0223 08:45:39.856532 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:39 crc kubenswrapper[5047]: I0223 08:45:39.964131 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-config\") pod \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " Feb 23 08:45:39 crc kubenswrapper[5047]: I0223 08:45:39.964282 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-combined-ca-bundle\") pod \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " Feb 23 08:45:39 crc kubenswrapper[5047]: I0223 08:45:39.964427 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr77c\" (UniqueName: \"kubernetes.io/projected/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-kube-api-access-cr77c\") pod \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\" (UID: \"d77e4cfd-baf4-4d7f-bc0e-a28608a27638\") " Feb 23 08:45:39 crc kubenswrapper[5047]: I0223 08:45:39.971104 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-kube-api-access-cr77c" (OuterVolumeSpecName: "kube-api-access-cr77c") pod "d77e4cfd-baf4-4d7f-bc0e-a28608a27638" (UID: "d77e4cfd-baf4-4d7f-bc0e-a28608a27638"). InnerVolumeSpecName "kube-api-access-cr77c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:39 crc kubenswrapper[5047]: I0223 08:45:39.997022 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-config" (OuterVolumeSpecName: "config") pod "d77e4cfd-baf4-4d7f-bc0e-a28608a27638" (UID: "d77e4cfd-baf4-4d7f-bc0e-a28608a27638"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:39.999869 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d77e4cfd-baf4-4d7f-bc0e-a28608a27638" (UID: "d77e4cfd-baf4-4d7f-bc0e-a28608a27638"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.066289 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.066340 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.066367 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr77c\" (UniqueName: \"kubernetes.io/projected/d77e4cfd-baf4-4d7f-bc0e-a28608a27638-kube-api-access-cr77c\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.503693 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w76mz" event={"ID":"d77e4cfd-baf4-4d7f-bc0e-a28608a27638","Type":"ContainerDied","Data":"a775166635dbb1def2517223661ac61ab9768b0320eaa2bfa86a8a577b8c54a9"} Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.503736 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a775166635dbb1def2517223661ac61ab9768b0320eaa2bfa86a8a577b8c54a9" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.504244 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w76mz" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.863714 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84bcc8bdb-bcv5z"] Feb 23 08:45:40 crc kubenswrapper[5047]: E0223 08:45:40.864433 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="extract-utilities" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.864449 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="extract-utilities" Feb 23 08:45:40 crc kubenswrapper[5047]: E0223 08:45:40.864461 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77e4cfd-baf4-4d7f-bc0e-a28608a27638" containerName="neutron-db-sync" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.864468 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77e4cfd-baf4-4d7f-bc0e-a28608a27638" containerName="neutron-db-sync" Feb 23 08:45:40 crc kubenswrapper[5047]: E0223 08:45:40.864485 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="extract-content" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.864491 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="extract-content" Feb 23 08:45:40 crc kubenswrapper[5047]: E0223 08:45:40.864502 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="registry-server" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.864508 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="registry-server" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.864688 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77e4cfd-baf4-4d7f-bc0e-a28608a27638" 
containerName="neutron-db-sync" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.864698 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ed4d35-2262-4f98-ad36-5c3c36b97a3e" containerName="registry-server" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.865586 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.870855 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.870976 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.871182 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.871306 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77458987f-gzdbf"] Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.871623 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bwb8h" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.872686 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.880899 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84bcc8bdb-bcv5z"] Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.892956 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77458987f-gzdbf"] Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.984993 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-config\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985086 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czkwl\" (UniqueName: \"kubernetes.io/projected/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-kube-api-access-czkwl\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985134 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-ovndb-tls-certs\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985174 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-nb\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " 
pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985218 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-config\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985252 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-dns-svc\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985274 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-combined-ca-bundle\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985304 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-sb\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985353 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-httpd-config\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " 
pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:40 crc kubenswrapper[5047]: I0223 08:45:40.985415 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchwk\" (UniqueName: \"kubernetes.io/projected/8075f6ee-15d3-4c72-8da3-1754a90710ee-kube-api-access-jchwk\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088087 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-config\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088167 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czkwl\" (UniqueName: \"kubernetes.io/projected/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-kube-api-access-czkwl\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088210 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-ovndb-tls-certs\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088249 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-nb\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 
08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088287 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-config\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088322 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-dns-svc\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088363 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-combined-ca-bundle\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088390 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-sb\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088431 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-httpd-config\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.088489 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jchwk\" (UniqueName: \"kubernetes.io/projected/8075f6ee-15d3-4c72-8da3-1754a90710ee-kube-api-access-jchwk\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.089045 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-config\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.089641 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-dns-svc\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.090036 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-sb\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.090349 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-nb\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.095195 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-config\") pod 
\"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.095770 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-ovndb-tls-certs\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.099583 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-combined-ca-bundle\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.100653 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-httpd-config\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.123738 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czkwl\" (UniqueName: \"kubernetes.io/projected/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-kube-api-access-czkwl\") pod \"dnsmasq-dns-77458987f-gzdbf\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.152797 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchwk\" (UniqueName: \"kubernetes.io/projected/8075f6ee-15d3-4c72-8da3-1754a90710ee-kube-api-access-jchwk\") pod \"neutron-84bcc8bdb-bcv5z\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " 
pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.200244 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.207748 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:41 crc kubenswrapper[5047]: I0223 08:45:41.780854 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77458987f-gzdbf"] Feb 23 08:45:42 crc kubenswrapper[5047]: I0223 08:45:42.059506 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84bcc8bdb-bcv5z"] Feb 23 08:45:42 crc kubenswrapper[5047]: W0223 08:45:42.082045 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8075f6ee_15d3_4c72_8da3_1754a90710ee.slice/crio-5d6f8ae610b910be432118f4f505de1760d74e26b3704437faf76ea33743b19b WatchSource:0}: Error finding container 5d6f8ae610b910be432118f4f505de1760d74e26b3704437faf76ea33743b19b: Status 404 returned error can't find the container with id 5d6f8ae610b910be432118f4f505de1760d74e26b3704437faf76ea33743b19b Feb 23 08:45:42 crc kubenswrapper[5047]: I0223 08:45:42.539189 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bcc8bdb-bcv5z" event={"ID":"8075f6ee-15d3-4c72-8da3-1754a90710ee","Type":"ContainerStarted","Data":"2fa1b80f55ae46237e4cf3484aa320012cbf5b502bff01fd05234a1f684dfe9e"} Feb 23 08:45:42 crc kubenswrapper[5047]: I0223 08:45:42.539708 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bcc8bdb-bcv5z" event={"ID":"8075f6ee-15d3-4c72-8da3-1754a90710ee","Type":"ContainerStarted","Data":"5d6f8ae610b910be432118f4f505de1760d74e26b3704437faf76ea33743b19b"} Feb 23 08:45:42 crc kubenswrapper[5047]: I0223 08:45:42.555028 5047 generic.go:334] "Generic (PLEG): container 
finished" podID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" containerID="05194a266cf7c256bf1de95e0e41314bb4da672c8141ebc5b26c6f2e7850498b" exitCode=0 Feb 23 08:45:42 crc kubenswrapper[5047]: I0223 08:45:42.555098 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77458987f-gzdbf" event={"ID":"ada6a2f2-1e71-45a6-91ef-73f00a6194f1","Type":"ContainerDied","Data":"05194a266cf7c256bf1de95e0e41314bb4da672c8141ebc5b26c6f2e7850498b"} Feb 23 08:45:42 crc kubenswrapper[5047]: I0223 08:45:42.555143 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77458987f-gzdbf" event={"ID":"ada6a2f2-1e71-45a6-91ef-73f00a6194f1","Type":"ContainerStarted","Data":"81a0fd969fc1777b6a85107c69c7f66b079ddf08ba90cf6e6192b6b12ee9f930"} Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.215026 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-755d9b4d6f-jbxjv"] Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.217078 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.222768 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.223067 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.231954 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-internal-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.232045 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqlz\" (UniqueName: \"kubernetes.io/projected/f1306003-e12b-4db1-beeb-cd461db0975e-kube-api-access-zsqlz\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.232103 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-ovndb-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.232127 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-public-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " 
pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.232151 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-combined-ca-bundle\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.232485 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-httpd-config\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.232629 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-config\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.256746 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-755d9b4d6f-jbxjv"] Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.333503 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-internal-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.333564 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqlz\" (UniqueName: 
\"kubernetes.io/projected/f1306003-e12b-4db1-beeb-cd461db0975e-kube-api-access-zsqlz\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.333605 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-ovndb-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.333624 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-public-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.333649 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-combined-ca-bundle\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.333715 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-httpd-config\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.333765 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-config\") pod 
\"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.339333 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-ovndb-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.344474 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-public-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.344729 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-httpd-config\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.345017 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-combined-ca-bundle\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.345162 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-internal-tls-certs\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 
08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.345327 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-config\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.367877 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqlz\" (UniqueName: \"kubernetes.io/projected/f1306003-e12b-4db1-beeb-cd461db0975e-kube-api-access-zsqlz\") pod \"neutron-755d9b4d6f-jbxjv\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.550832 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.602234 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77458987f-gzdbf" event={"ID":"ada6a2f2-1e71-45a6-91ef-73f00a6194f1","Type":"ContainerStarted","Data":"ab70ceb9ecf30e893ccca85afd1ab01fe88aba7a9060c1cb45aa46f5abc35858"} Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.602814 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.638189 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bcc8bdb-bcv5z" event={"ID":"8075f6ee-15d3-4c72-8da3-1754a90710ee","Type":"ContainerStarted","Data":"9b9aab6ddebfc63cbf7df9b35ce81cbd20dd7b072ee66355a451faedd2da6c45"} Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.638807 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.662990 5047 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/dnsmasq-dns-77458987f-gzdbf" podStartSLOduration=3.662972349 podStartE2EDuration="3.662972349s" podCreationTimestamp="2026-02-23 08:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:43.659311011 +0000 UTC m=+7265.910638145" watchObservedRunningTime="2026-02-23 08:45:43.662972349 +0000 UTC m=+7265.914299483" Feb 23 08:45:43 crc kubenswrapper[5047]: I0223 08:45:43.694344 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84bcc8bdb-bcv5z" podStartSLOduration=3.694326442 podStartE2EDuration="3.694326442s" podCreationTimestamp="2026-02-23 08:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:43.690699474 +0000 UTC m=+7265.942026608" watchObservedRunningTime="2026-02-23 08:45:43.694326442 +0000 UTC m=+7265.945653576" Feb 23 08:45:44 crc kubenswrapper[5047]: I0223 08:45:44.354001 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-755d9b4d6f-jbxjv"] Feb 23 08:45:44 crc kubenswrapper[5047]: I0223 08:45:44.678808 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755d9b4d6f-jbxjv" event={"ID":"f1306003-e12b-4db1-beeb-cd461db0975e","Type":"ContainerStarted","Data":"90d8272e14124df09a1e4603e8bf04a61add7c1cf9078eea0c18f84d9f987fe3"} Feb 23 08:45:44 crc kubenswrapper[5047]: I0223 08:45:44.679167 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755d9b4d6f-jbxjv" event={"ID":"f1306003-e12b-4db1-beeb-cd461db0975e","Type":"ContainerStarted","Data":"e34a311f5c039beb78ef7cb8b18633ff064cb83ddb6ea59bdbeed93d52ec9700"} Feb 23 08:45:45 crc kubenswrapper[5047]: I0223 08:45:45.688549 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755d9b4d6f-jbxjv" 
event={"ID":"f1306003-e12b-4db1-beeb-cd461db0975e","Type":"ContainerStarted","Data":"9ea42abec4f78326158bc5cb0cec2641d24efe082290fbd2778b5995dbeb07c2"} Feb 23 08:45:45 crc kubenswrapper[5047]: I0223 08:45:45.713714 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-755d9b4d6f-jbxjv" podStartSLOduration=2.713698028 podStartE2EDuration="2.713698028s" podCreationTimestamp="2026-02-23 08:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:45.707944724 +0000 UTC m=+7267.959271878" watchObservedRunningTime="2026-02-23 08:45:45.713698028 +0000 UTC m=+7267.965025162" Feb 23 08:45:46 crc kubenswrapper[5047]: I0223 08:45:46.699150 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:45:46 crc kubenswrapper[5047]: I0223 08:45:46.759401 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:45:46 crc kubenswrapper[5047]: I0223 08:45:46.759476 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.210127 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.265515 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549cbc9d6f-h9h4z"] Feb 23 08:45:51 
crc kubenswrapper[5047]: I0223 08:45:51.265876 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" podUID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerName="dnsmasq-dns" containerID="cri-o://cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d" gracePeriod=10 Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.736256 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.741249 5047 generic.go:334] "Generic (PLEG): container finished" podID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerID="cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d" exitCode=0 Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.741285 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" event={"ID":"ce2a4182-5761-4a21-82b4-02f46e3c7b5c","Type":"ContainerDied","Data":"cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d"} Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.741308 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" event={"ID":"ce2a4182-5761-4a21-82b4-02f46e3c7b5c","Type":"ContainerDied","Data":"1cf1c1a43bb757cd5529002ab9c90a40342ac2ab5f3a75d3f602b264a8c42155"} Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.741323 5047 scope.go:117] "RemoveContainer" containerID="cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.741423 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549cbc9d6f-h9h4z" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.761074 5047 scope.go:117] "RemoveContainer" containerID="4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.802067 5047 scope.go:117] "RemoveContainer" containerID="cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d" Feb 23 08:45:51 crc kubenswrapper[5047]: E0223 08:45:51.808151 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d\": container with ID starting with cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d not found: ID does not exist" containerID="cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.808200 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d"} err="failed to get container status \"cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d\": rpc error: code = NotFound desc = could not find container \"cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d\": container with ID starting with cb029c3bbbf07a7d5870f7915afd9ffe176946f36ed02f87443b0c725e6c274d not found: ID does not exist" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.808225 5047 scope.go:117] "RemoveContainer" containerID="4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77" Feb 23 08:45:51 crc kubenswrapper[5047]: E0223 08:45:51.808635 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77\": container with ID starting with 
4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77 not found: ID does not exist" containerID="4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.808657 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77"} err="failed to get container status \"4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77\": rpc error: code = NotFound desc = could not find container \"4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77\": container with ID starting with 4db6e30204466bf60e52c5d6f34648ef975470acce6d6e76b535155b33921f77 not found: ID does not exist" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.904284 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6kb2\" (UniqueName: \"kubernetes.io/projected/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-kube-api-access-t6kb2\") pod \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.904356 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-sb\") pod \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.904499 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-config\") pod \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.904523 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-nb\") pod \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.904550 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-dns-svc\") pod \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\" (UID: \"ce2a4182-5761-4a21-82b4-02f46e3c7b5c\") " Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.919147 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-kube-api-access-t6kb2" (OuterVolumeSpecName: "kube-api-access-t6kb2") pod "ce2a4182-5761-4a21-82b4-02f46e3c7b5c" (UID: "ce2a4182-5761-4a21-82b4-02f46e3c7b5c"). InnerVolumeSpecName "kube-api-access-t6kb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.942793 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce2a4182-5761-4a21-82b4-02f46e3c7b5c" (UID: "ce2a4182-5761-4a21-82b4-02f46e3c7b5c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.952589 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce2a4182-5761-4a21-82b4-02f46e3c7b5c" (UID: "ce2a4182-5761-4a21-82b4-02f46e3c7b5c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.964293 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce2a4182-5761-4a21-82b4-02f46e3c7b5c" (UID: "ce2a4182-5761-4a21-82b4-02f46e3c7b5c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:51 crc kubenswrapper[5047]: I0223 08:45:51.964835 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-config" (OuterVolumeSpecName: "config") pod "ce2a4182-5761-4a21-82b4-02f46e3c7b5c" (UID: "ce2a4182-5761-4a21-82b4-02f46e3c7b5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 08:45:52.005976 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 08:45:52.006006 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 08:45:52.006019 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 08:45:52.006030 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6kb2\" (UniqueName: \"kubernetes.io/projected/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-kube-api-access-t6kb2\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 
08:45:52.006039 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce2a4182-5761-4a21-82b4-02f46e3c7b5c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 08:45:52.074072 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549cbc9d6f-h9h4z"] Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 08:45:52.083571 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-549cbc9d6f-h9h4z"] Feb 23 08:45:52 crc kubenswrapper[5047]: I0223 08:45:52.352870 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" path="/var/lib/kubelet/pods/ce2a4182-5761-4a21-82b4-02f46e3c7b5c/volumes" Feb 23 08:45:56 crc kubenswrapper[5047]: I0223 08:45:56.381336 5047 scope.go:117] "RemoveContainer" containerID="bf3b0236277e00464a5629357de2ec8dbe421e7731e772f51908e1af8425b9cc" Feb 23 08:46:11 crc kubenswrapper[5047]: I0223 08:46:11.222121 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:46:13 crc kubenswrapper[5047]: I0223 08:46:13.570025 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 08:46:13 crc kubenswrapper[5047]: I0223 08:46:13.660372 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84bcc8bdb-bcv5z"] Feb 23 08:46:13 crc kubenswrapper[5047]: I0223 08:46:13.660755 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84bcc8bdb-bcv5z" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerName="neutron-api" containerID="cri-o://2fa1b80f55ae46237e4cf3484aa320012cbf5b502bff01fd05234a1f684dfe9e" gracePeriod=30 Feb 23 08:46:13 crc kubenswrapper[5047]: I0223 08:46:13.661014 5047 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-84bcc8bdb-bcv5z" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerName="neutron-httpd" containerID="cri-o://9b9aab6ddebfc63cbf7df9b35ce81cbd20dd7b072ee66355a451faedd2da6c45" gracePeriod=30 Feb 23 08:46:15 crc kubenswrapper[5047]: I0223 08:46:15.013038 5047 generic.go:334] "Generic (PLEG): container finished" podID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerID="9b9aab6ddebfc63cbf7df9b35ce81cbd20dd7b072ee66355a451faedd2da6c45" exitCode=0 Feb 23 08:46:15 crc kubenswrapper[5047]: I0223 08:46:15.013109 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bcc8bdb-bcv5z" event={"ID":"8075f6ee-15d3-4c72-8da3-1754a90710ee","Type":"ContainerDied","Data":"9b9aab6ddebfc63cbf7df9b35ce81cbd20dd7b072ee66355a451faedd2da6c45"} Feb 23 08:46:16 crc kubenswrapper[5047]: I0223 08:46:16.759401 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:46:16 crc kubenswrapper[5047]: I0223 08:46:16.759810 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:46:16 crc kubenswrapper[5047]: I0223 08:46:16.759857 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 08:46:16 crc kubenswrapper[5047]: I0223 08:46:16.760598 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5480ffc93828193b25025f4f8362ac8f87100f59cbe44b98393e3159b56048e0"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:46:16 crc kubenswrapper[5047]: I0223 08:46:16.760643 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://5480ffc93828193b25025f4f8362ac8f87100f59cbe44b98393e3159b56048e0" gracePeriod=600 Feb 23 08:46:17 crc kubenswrapper[5047]: I0223 08:46:17.033870 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="5480ffc93828193b25025f4f8362ac8f87100f59cbe44b98393e3159b56048e0" exitCode=0 Feb 23 08:46:17 crc kubenswrapper[5047]: I0223 08:46:17.034089 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"5480ffc93828193b25025f4f8362ac8f87100f59cbe44b98393e3159b56048e0"} Feb 23 08:46:17 crc kubenswrapper[5047]: I0223 08:46:17.034395 5047 scope.go:117] "RemoveContainer" containerID="4d6483318becda2624a6ce807e74d8adfeac803349506954621e3eb8c80f5357" Feb 23 08:46:18 crc kubenswrapper[5047]: I0223 08:46:18.055380 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"} Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.077950 5047 generic.go:334] "Generic (PLEG): container finished" podID="8075f6ee-15d3-4c72-8da3-1754a90710ee" 
containerID="2fa1b80f55ae46237e4cf3484aa320012cbf5b502bff01fd05234a1f684dfe9e" exitCode=0 Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.078285 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bcc8bdb-bcv5z" event={"ID":"8075f6ee-15d3-4c72-8da3-1754a90710ee","Type":"ContainerDied","Data":"2fa1b80f55ae46237e4cf3484aa320012cbf5b502bff01fd05234a1f684dfe9e"} Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.279936 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.319899 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-ovndb-tls-certs\") pod \"8075f6ee-15d3-4c72-8da3-1754a90710ee\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.320105 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-httpd-config\") pod \"8075f6ee-15d3-4c72-8da3-1754a90710ee\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.320211 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jchwk\" (UniqueName: \"kubernetes.io/projected/8075f6ee-15d3-4c72-8da3-1754a90710ee-kube-api-access-jchwk\") pod \"8075f6ee-15d3-4c72-8da3-1754a90710ee\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.320272 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-config\") pod \"8075f6ee-15d3-4c72-8da3-1754a90710ee\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " Feb 23 
08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.320414 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-combined-ca-bundle\") pod \"8075f6ee-15d3-4c72-8da3-1754a90710ee\" (UID: \"8075f6ee-15d3-4c72-8da3-1754a90710ee\") " Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.330008 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8075f6ee-15d3-4c72-8da3-1754a90710ee-kube-api-access-jchwk" (OuterVolumeSpecName: "kube-api-access-jchwk") pod "8075f6ee-15d3-4c72-8da3-1754a90710ee" (UID: "8075f6ee-15d3-4c72-8da3-1754a90710ee"). InnerVolumeSpecName "kube-api-access-jchwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.330149 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8075f6ee-15d3-4c72-8da3-1754a90710ee" (UID: "8075f6ee-15d3-4c72-8da3-1754a90710ee"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.383153 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8075f6ee-15d3-4c72-8da3-1754a90710ee" (UID: "8075f6ee-15d3-4c72-8da3-1754a90710ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.383690 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-config" (OuterVolumeSpecName: "config") pod "8075f6ee-15d3-4c72-8da3-1754a90710ee" (UID: "8075f6ee-15d3-4c72-8da3-1754a90710ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.405024 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8075f6ee-15d3-4c72-8da3-1754a90710ee" (UID: "8075f6ee-15d3-4c72-8da3-1754a90710ee"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.425173 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.425247 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jchwk\" (UniqueName: \"kubernetes.io/projected/8075f6ee-15d3-4c72-8da3-1754a90710ee-kube-api-access-jchwk\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.425260 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:19 crc kubenswrapper[5047]: I0223 08:46:19.425269 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:19 crc 
kubenswrapper[5047]: I0223 08:46:19.425279 5047 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075f6ee-15d3-4c72-8da3-1754a90710ee-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:20 crc kubenswrapper[5047]: I0223 08:46:20.093932 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bcc8bdb-bcv5z" event={"ID":"8075f6ee-15d3-4c72-8da3-1754a90710ee","Type":"ContainerDied","Data":"5d6f8ae610b910be432118f4f505de1760d74e26b3704437faf76ea33743b19b"} Feb 23 08:46:20 crc kubenswrapper[5047]: I0223 08:46:20.094007 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84bcc8bdb-bcv5z" Feb 23 08:46:20 crc kubenswrapper[5047]: I0223 08:46:20.094383 5047 scope.go:117] "RemoveContainer" containerID="9b9aab6ddebfc63cbf7df9b35ce81cbd20dd7b072ee66355a451faedd2da6c45" Feb 23 08:46:20 crc kubenswrapper[5047]: I0223 08:46:20.141835 5047 scope.go:117] "RemoveContainer" containerID="2fa1b80f55ae46237e4cf3484aa320012cbf5b502bff01fd05234a1f684dfe9e" Feb 23 08:46:20 crc kubenswrapper[5047]: I0223 08:46:20.159688 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84bcc8bdb-bcv5z"] Feb 23 08:46:20 crc kubenswrapper[5047]: I0223 08:46:20.167239 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84bcc8bdb-bcv5z"] Feb 23 08:46:20 crc kubenswrapper[5047]: I0223 08:46:20.370023 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" path="/var/lib/kubelet/pods/8075f6ee-15d3-4c72-8da3-1754a90710ee/volumes" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.749358 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qv6xg"] Feb 23 08:46:33 crc kubenswrapper[5047]: E0223 08:46:33.753834 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" 
containerName="neutron-api" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.753950 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerName="neutron-api" Feb 23 08:46:33 crc kubenswrapper[5047]: E0223 08:46:33.754082 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerName="neutron-httpd" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.754155 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerName="neutron-httpd" Feb 23 08:46:33 crc kubenswrapper[5047]: E0223 08:46:33.754238 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerName="init" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.754289 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerName="init" Feb 23 08:46:33 crc kubenswrapper[5047]: E0223 08:46:33.754353 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerName="dnsmasq-dns" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.754413 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerName="dnsmasq-dns" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.757009 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerName="neutron-httpd" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.757121 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2a4182-5761-4a21-82b4-02f46e3c7b5c" containerName="dnsmasq-dns" Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.757202 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8075f6ee-15d3-4c72-8da3-1754a90710ee" containerName="neutron-api" Feb 23 08:46:33 crc 
kubenswrapper[5047]: I0223 08:46:33.758165 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.763179 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.764013 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-hgczk"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.764135 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.764320 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.764470 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.779045 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qv6xg"]
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.830604 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77c98fc85f-sw8fn"]
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.833416 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.850821 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c98fc85f-sw8fn"]
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953660 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5fcc\" (UniqueName: \"kubernetes.io/projected/dda27df2-528e-46fb-9e17-1dfacd9e70ee-kube-api-access-x5fcc\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953721 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxzt8\" (UniqueName: \"kubernetes.io/projected/8dd40f96-4019-455e-b292-19cd00b8e616-kube-api-access-lxzt8\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953754 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dd40f96-4019-455e-b292-19cd00b8e616-etc-swift\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953788 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-dispersionconf\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953819 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-combined-ca-bundle\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953837 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953856 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-dns-svc\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953907 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-config\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953947 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.953979 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-swiftconf\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.954010 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-ring-data-devices\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:33 crc kubenswrapper[5047]: I0223 08:46:33.954036 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-scripts\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056096 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dd40f96-4019-455e-b292-19cd00b8e616-etc-swift\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056156 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-dispersionconf\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056196 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-combined-ca-bundle\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056219 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056242 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-dns-svc\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056285 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-config\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056310 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056343 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-swiftconf\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056369 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-ring-data-devices\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056391 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-scripts\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056440 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5fcc\" (UniqueName: \"kubernetes.io/projected/dda27df2-528e-46fb-9e17-1dfacd9e70ee-kube-api-access-x5fcc\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056460 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxzt8\" (UniqueName: \"kubernetes.io/projected/8dd40f96-4019-455e-b292-19cd00b8e616-kube-api-access-lxzt8\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.056589 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dd40f96-4019-455e-b292-19cd00b8e616-etc-swift\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.057333 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.057630 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-scripts\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.058029 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-config\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.058264 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-ring-data-devices\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.058529 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.058581 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-dns-svc\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.062454 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-swiftconf\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.062653 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-combined-ca-bundle\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.062758 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-dispersionconf\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.077613 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5fcc\" (UniqueName: \"kubernetes.io/projected/dda27df2-528e-46fb-9e17-1dfacd9e70ee-kube-api-access-x5fcc\") pod \"dnsmasq-dns-77c98fc85f-sw8fn\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.080446 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxzt8\" (UniqueName: \"kubernetes.io/projected/8dd40f96-4019-455e-b292-19cd00b8e616-kube-api-access-lxzt8\") pod \"swift-ring-rebalance-qv6xg\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.118963 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qv6xg"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.163720 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.657193 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qv6xg"]
Feb 23 08:46:34 crc kubenswrapper[5047]: W0223 08:46:34.664450 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd40f96_4019_455e_b292_19cd00b8e616.slice/crio-df69506a29f6aa7da6c9c7077ebe14c31cce7f91547378113031ad7b20c8673b WatchSource:0}: Error finding container df69506a29f6aa7da6c9c7077ebe14c31cce7f91547378113031ad7b20c8673b: Status 404 returned error can't find the container with id df69506a29f6aa7da6c9c7077ebe14c31cce7f91547378113031ad7b20c8673b
Feb 23 08:46:34 crc kubenswrapper[5047]: I0223 08:46:34.783554 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c98fc85f-sw8fn"]
Feb 23 08:46:34 crc kubenswrapper[5047]: W0223 08:46:34.788184 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddda27df2_528e_46fb_9e17_1dfacd9e70ee.slice/crio-ccd6e19bb9b12cbeab5c0792f2598c377bdd50dc5ece7887f7c30a26747e4807 WatchSource:0}: Error finding container ccd6e19bb9b12cbeab5c0792f2598c377bdd50dc5ece7887f7c30a26747e4807: Status 404 returned error can't find the container with id ccd6e19bb9b12cbeab5c0792f2598c377bdd50dc5ece7887f7c30a26747e4807
Feb 23 08:46:35 crc kubenswrapper[5047]: I0223 08:46:35.259302 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qv6xg" event={"ID":"8dd40f96-4019-455e-b292-19cd00b8e616","Type":"ContainerStarted","Data":"df69506a29f6aa7da6c9c7077ebe14c31cce7f91547378113031ad7b20c8673b"}
Feb 23 08:46:35 crc kubenswrapper[5047]: I0223 08:46:35.261751 5047 generic.go:334] "Generic (PLEG): container finished" podID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerID="fcb586ad993c48e8b973ec29427639c02accea5ae41172e65e4e6b2908bec5bc" exitCode=0
Feb 23 08:46:35 crc kubenswrapper[5047]: I0223 08:46:35.261816 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" event={"ID":"dda27df2-528e-46fb-9e17-1dfacd9e70ee","Type":"ContainerDied","Data":"fcb586ad993c48e8b973ec29427639c02accea5ae41172e65e4e6b2908bec5bc"}
Feb 23 08:46:35 crc kubenswrapper[5047]: I0223 08:46:35.261900 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" event={"ID":"dda27df2-528e-46fb-9e17-1dfacd9e70ee","Type":"ContainerStarted","Data":"ccd6e19bb9b12cbeab5c0792f2598c377bdd50dc5ece7887f7c30a26747e4807"}
Feb 23 08:46:36 crc kubenswrapper[5047]: I0223 08:46:36.281391 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" event={"ID":"dda27df2-528e-46fb-9e17-1dfacd9e70ee","Type":"ContainerStarted","Data":"4b5105c60c8298f6cd7a7fc1bfb183c9fa6eb74473d079079174a517180a28a8"}
Feb 23 08:46:36 crc kubenswrapper[5047]: I0223 08:46:36.282226 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn"
Feb 23 08:46:36 crc kubenswrapper[5047]: I0223 08:46:36.307151 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" podStartSLOduration=3.307132368 podStartE2EDuration="3.307132368s" podCreationTimestamp="2026-02-23 08:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:46:36.302979306 +0000 UTC m=+7318.554306440" watchObservedRunningTime="2026-02-23 08:46:36.307132368 +0000 UTC m=+7318.558459502"
Feb 23 08:46:36 crc kubenswrapper[5047]: I0223 08:46:36.945850 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7fc7875764-v8676"]
Feb 23 08:46:36 crc kubenswrapper[5047]: I0223 08:46:36.947623 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:36 crc kubenswrapper[5047]: I0223 08:46:36.949759 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 23 08:46:36 crc kubenswrapper[5047]: I0223 08:46:36.968039 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7fc7875764-v8676"]
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.110698 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-combined-ca-bundle\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.110788 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg2ts\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-kube-api-access-mg2ts\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.110821 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-config-data\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.110850 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-run-httpd\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.110978 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-etc-swift\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.111043 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-log-httpd\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.212870 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-combined-ca-bundle\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.212955 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg2ts\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-kube-api-access-mg2ts\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.212973 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-config-data\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.212989 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-run-httpd\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.213044 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-etc-swift\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.213078 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-log-httpd\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.213625 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-log-httpd\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.213853 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-run-httpd\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.225629 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-combined-ca-bundle\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.225858 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-etc-swift\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.236504 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg2ts\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-kube-api-access-mg2ts\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.242386 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-config-data\") pod \"swift-proxy-7fc7875764-v8676\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:37 crc kubenswrapper[5047]: I0223 08:46:37.268803 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7fc7875764-v8676"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.541289 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5448bb6c56-5br5b"]
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.542857 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.545278 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.547059 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.575251 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5448bb6c56-5br5b"]
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.648661 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzhn\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-kube-api-access-gzzhn\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.648752 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-internal-tls-certs\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.648792 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-config-data\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.648835 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-log-httpd\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.648859 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-public-tls-certs\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.648890 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-combined-ca-bundle\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.648967 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-run-httpd\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.649010 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-etc-swift\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.750971 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-run-httpd\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.751029 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-etc-swift\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.751079 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzhn\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-kube-api-access-gzzhn\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.751110 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-internal-tls-certs\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.751136 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-config-data\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.751172 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-log-httpd\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.751193 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-public-tls-certs\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.751243 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-combined-ca-bundle\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.752454 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-log-httpd\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.752544 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-run-httpd\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.754045 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.754810 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.757483 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-combined-ca-bundle\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.758745 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-config-data\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.762863 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-etc-swift\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.768087 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-public-tls-certs\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.769535 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-internal-tls-certs\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.773221 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzhn\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-kube-api-access-gzzhn\") pod \"swift-proxy-5448bb6c56-5br5b\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:38 crc kubenswrapper[5047]: I0223 08:46:38.875313 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5448bb6c56-5br5b"
Feb 23 08:46:40 crc kubenswrapper[5047]: I0223 08:46:40.048574 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7fc7875764-v8676"]
Feb 23 08:46:40 crc kubenswrapper[5047]: I0223 08:46:40.373646 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7fc7875764-v8676" event={"ID":"41c5c59a-c3ea-4f82-8bf4-9a95787616f6","Type":"ContainerStarted","Data":"75ea2117156de6091553e73240a2df927f51cf2b5840af7ca4269efc526154b7"}
Feb 23 08:46:40 crc kubenswrapper[5047]: I0223 08:46:40.373707 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7fc7875764-v8676" event={"ID":"41c5c59a-c3ea-4f82-8bf4-9a95787616f6","Type":"ContainerStarted","Data":"d49611fe956fa48252bcae42e81ff7c1512a70bfa7d317aa8ecb1ea449d671a6"}
Feb 23 08:46:40 crc kubenswrapper[5047]: I0223 08:46:40.375932 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qv6xg" event={"ID":"8dd40f96-4019-455e-b292-19cd00b8e616","Type":"ContainerStarted","Data":"f1af0f63c5a3f5e9fc8ecb2896b6500cf5a64a631e0df682cd475c9a6a8beb23"}
Feb 23 08:46:40 crc kubenswrapper[5047]: I0223 08:46:40.399020 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qv6xg" podStartSLOduration=2.585874347 podStartE2EDuration="7.399001949s" podCreationTimestamp="2026-02-23 08:46:33 +0000 UTC" firstStartedPulling="2026-02-23 08:46:34.665885591 +0000 UTC m=+7316.917212725" lastFinishedPulling="2026-02-23 08:46:39.479013193 +0000 UTC m=+7321.730340327" observedRunningTime="2026-02-23 08:46:40.397022966 +0000 UTC m=+7322.648350100" watchObservedRunningTime="2026-02-23 08:46:40.399001949 +0000 UTC m=+7322.650329083"
Feb 23 08:46:40 crc kubenswrapper[5047]: I0223 08:46:40.971016 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5448bb6c56-5br5b"]
Feb 23 08:46:41 crc 
kubenswrapper[5047]: I0223 08:46:41.393992 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7fc7875764-v8676" event={"ID":"41c5c59a-c3ea-4f82-8bf4-9a95787616f6","Type":"ContainerStarted","Data":"bfadfd5027fb517aa3370b27dd53eac6df4ee5d83fb50fe95937c7fea74195aa"} Feb 23 08:46:41 crc kubenswrapper[5047]: I0223 08:46:41.395966 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7fc7875764-v8676" Feb 23 08:46:41 crc kubenswrapper[5047]: I0223 08:46:41.396614 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448bb6c56-5br5b" event={"ID":"0c50549a-ecc3-4c2f-a625-4cfa3492fbad","Type":"ContainerStarted","Data":"8a9512e77ec9a5c41445f0fc54c6a74435b9b78c0ecdcff32ba1f9acce025e1f"} Feb 23 08:46:41 crc kubenswrapper[5047]: I0223 08:46:41.396664 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448bb6c56-5br5b" event={"ID":"0c50549a-ecc3-4c2f-a625-4cfa3492fbad","Type":"ContainerStarted","Data":"8add52e375d3e75b9aee7e2b15793bfcdbaf51c6a30afc98e9cd3ea8dcb9b935"} Feb 23 08:46:41 crc kubenswrapper[5047]: I0223 08:46:41.420747 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7fc7875764-v8676" podStartSLOduration=5.420721896 podStartE2EDuration="5.420721896s" podCreationTimestamp="2026-02-23 08:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:46:41.418366312 +0000 UTC m=+7323.669693446" watchObservedRunningTime="2026-02-23 08:46:41.420721896 +0000 UTC m=+7323.672049050" Feb 23 08:46:42 crc kubenswrapper[5047]: I0223 08:46:42.269422 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7fc7875764-v8676" Feb 23 08:46:42 crc kubenswrapper[5047]: I0223 08:46:42.407326 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-5448bb6c56-5br5b" event={"ID":"0c50549a-ecc3-4c2f-a625-4cfa3492fbad","Type":"ContainerStarted","Data":"cff924505599b5b5ba1b26edde59684e4ca7be4355f211ee64421ecef23a8028"} Feb 23 08:46:42 crc kubenswrapper[5047]: I0223 08:46:42.408874 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5448bb6c56-5br5b" Feb 23 08:46:42 crc kubenswrapper[5047]: I0223 08:46:42.409004 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5448bb6c56-5br5b" Feb 23 08:46:42 crc kubenswrapper[5047]: I0223 08:46:42.431111 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5448bb6c56-5br5b" podStartSLOduration=4.431089608 podStartE2EDuration="4.431089608s" podCreationTimestamp="2026-02-23 08:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:46:42.426931946 +0000 UTC m=+7324.678259080" watchObservedRunningTime="2026-02-23 08:46:42.431089608 +0000 UTC m=+7324.682416752" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.165265 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.242166 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77458987f-gzdbf"] Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.242521 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77458987f-gzdbf" podUID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" containerName="dnsmasq-dns" containerID="cri-o://ab70ceb9ecf30e893ccca85afd1ab01fe88aba7a9060c1cb45aa46f5abc35858" gracePeriod=10 Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.429126 5047 generic.go:334] "Generic (PLEG): container finished" podID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" 
containerID="ab70ceb9ecf30e893ccca85afd1ab01fe88aba7a9060c1cb45aa46f5abc35858" exitCode=0 Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.429184 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77458987f-gzdbf" event={"ID":"ada6a2f2-1e71-45a6-91ef-73f00a6194f1","Type":"ContainerDied","Data":"ab70ceb9ecf30e893ccca85afd1ab01fe88aba7a9060c1cb45aa46f5abc35858"} Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.430420 5047 generic.go:334] "Generic (PLEG): container finished" podID="8dd40f96-4019-455e-b292-19cd00b8e616" containerID="f1af0f63c5a3f5e9fc8ecb2896b6500cf5a64a631e0df682cd475c9a6a8beb23" exitCode=0 Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.431313 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qv6xg" event={"ID":"8dd40f96-4019-455e-b292-19cd00b8e616","Type":"ContainerDied","Data":"f1af0f63c5a3f5e9fc8ecb2896b6500cf5a64a631e0df682cd475c9a6a8beb23"} Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.757796 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.891711 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czkwl\" (UniqueName: \"kubernetes.io/projected/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-kube-api-access-czkwl\") pod \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.892280 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-sb\") pod \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.892411 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-config\") pod \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.892468 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-dns-svc\") pod \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.892504 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-nb\") pod \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\" (UID: \"ada6a2f2-1e71-45a6-91ef-73f00a6194f1\") " Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.916830 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-kube-api-access-czkwl" (OuterVolumeSpecName: "kube-api-access-czkwl") pod "ada6a2f2-1e71-45a6-91ef-73f00a6194f1" (UID: "ada6a2f2-1e71-45a6-91ef-73f00a6194f1"). InnerVolumeSpecName "kube-api-access-czkwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.941424 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ada6a2f2-1e71-45a6-91ef-73f00a6194f1" (UID: "ada6a2f2-1e71-45a6-91ef-73f00a6194f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.951445 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-config" (OuterVolumeSpecName: "config") pod "ada6a2f2-1e71-45a6-91ef-73f00a6194f1" (UID: "ada6a2f2-1e71-45a6-91ef-73f00a6194f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.951679 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ada6a2f2-1e71-45a6-91ef-73f00a6194f1" (UID: "ada6a2f2-1e71-45a6-91ef-73f00a6194f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.963058 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ada6a2f2-1e71-45a6-91ef-73f00a6194f1" (UID: "ada6a2f2-1e71-45a6-91ef-73f00a6194f1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.994554 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czkwl\" (UniqueName: \"kubernetes.io/projected/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-kube-api-access-czkwl\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.994591 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.994605 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.995546 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:44 crc kubenswrapper[5047]: I0223 08:46:44.995616 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada6a2f2-1e71-45a6-91ef-73f00a6194f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:45 crc kubenswrapper[5047]: I0223 08:46:45.442611 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77458987f-gzdbf" Feb 23 08:46:45 crc kubenswrapper[5047]: I0223 08:46:45.442692 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77458987f-gzdbf" event={"ID":"ada6a2f2-1e71-45a6-91ef-73f00a6194f1","Type":"ContainerDied","Data":"81a0fd969fc1777b6a85107c69c7f66b079ddf08ba90cf6e6192b6b12ee9f930"} Feb 23 08:46:45 crc kubenswrapper[5047]: I0223 08:46:45.442749 5047 scope.go:117] "RemoveContainer" containerID="ab70ceb9ecf30e893ccca85afd1ab01fe88aba7a9060c1cb45aa46f5abc35858" Feb 23 08:46:45 crc kubenswrapper[5047]: I0223 08:46:45.492232 5047 scope.go:117] "RemoveContainer" containerID="05194a266cf7c256bf1de95e0e41314bb4da672c8141ebc5b26c6f2e7850498b" Feb 23 08:46:45 crc kubenswrapper[5047]: I0223 08:46:45.501008 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77458987f-gzdbf"] Feb 23 08:46:45 crc kubenswrapper[5047]: I0223 08:46:45.516318 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77458987f-gzdbf"] Feb 23 08:46:45 crc kubenswrapper[5047]: I0223 08:46:45.924241 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qv6xg" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.118058 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-swiftconf\") pod \"8dd40f96-4019-455e-b292-19cd00b8e616\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.118758 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-ring-data-devices\") pod \"8dd40f96-4019-455e-b292-19cd00b8e616\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.119082 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-scripts\") pod \"8dd40f96-4019-455e-b292-19cd00b8e616\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.119496 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8dd40f96-4019-455e-b292-19cd00b8e616" (UID: "8dd40f96-4019-455e-b292-19cd00b8e616"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.119675 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-combined-ca-bundle\") pod \"8dd40f96-4019-455e-b292-19cd00b8e616\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.119885 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxzt8\" (UniqueName: \"kubernetes.io/projected/8dd40f96-4019-455e-b292-19cd00b8e616-kube-api-access-lxzt8\") pod \"8dd40f96-4019-455e-b292-19cd00b8e616\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.120125 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dd40f96-4019-455e-b292-19cd00b8e616-etc-swift\") pod \"8dd40f96-4019-455e-b292-19cd00b8e616\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.120422 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-dispersionconf\") pod \"8dd40f96-4019-455e-b292-19cd00b8e616\" (UID: \"8dd40f96-4019-455e-b292-19cd00b8e616\") " Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.121373 5047 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.121401 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd40f96-4019-455e-b292-19cd00b8e616-etc-swift" (OuterVolumeSpecName: "etc-swift") 
pod "8dd40f96-4019-455e-b292-19cd00b8e616" (UID: "8dd40f96-4019-455e-b292-19cd00b8e616"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.128234 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd40f96-4019-455e-b292-19cd00b8e616-kube-api-access-lxzt8" (OuterVolumeSpecName: "kube-api-access-lxzt8") pod "8dd40f96-4019-455e-b292-19cd00b8e616" (UID: "8dd40f96-4019-455e-b292-19cd00b8e616"). InnerVolumeSpecName "kube-api-access-lxzt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.130866 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8dd40f96-4019-455e-b292-19cd00b8e616" (UID: "8dd40f96-4019-455e-b292-19cd00b8e616"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.150050 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8dd40f96-4019-455e-b292-19cd00b8e616" (UID: "8dd40f96-4019-455e-b292-19cd00b8e616"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.156429 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-scripts" (OuterVolumeSpecName: "scripts") pod "8dd40f96-4019-455e-b292-19cd00b8e616" (UID: "8dd40f96-4019-455e-b292-19cd00b8e616"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.161086 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dd40f96-4019-455e-b292-19cd00b8e616" (UID: "8dd40f96-4019-455e-b292-19cd00b8e616"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.224208 5047 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.224290 5047 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.224317 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd40f96-4019-455e-b292-19cd00b8e616-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.224342 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd40f96-4019-455e-b292-19cd00b8e616-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.224368 5047 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dd40f96-4019-455e-b292-19cd00b8e616-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.224395 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxzt8\" (UniqueName: 
\"kubernetes.io/projected/8dd40f96-4019-455e-b292-19cd00b8e616-kube-api-access-lxzt8\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.355314 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" path="/var/lib/kubelet/pods/ada6a2f2-1e71-45a6-91ef-73f00a6194f1/volumes" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.458435 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qv6xg" event={"ID":"8dd40f96-4019-455e-b292-19cd00b8e616","Type":"ContainerDied","Data":"df69506a29f6aa7da6c9c7077ebe14c31cce7f91547378113031ad7b20c8673b"} Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.458486 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df69506a29f6aa7da6c9c7077ebe14c31cce7f91547378113031ad7b20c8673b" Feb 23 08:46:46 crc kubenswrapper[5047]: I0223 08:46:46.458526 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qv6xg" Feb 23 08:46:47 crc kubenswrapper[5047]: I0223 08:46:47.273896 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7fc7875764-v8676" Feb 23 08:46:47 crc kubenswrapper[5047]: I0223 08:46:47.274514 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7fc7875764-v8676" Feb 23 08:46:48 crc kubenswrapper[5047]: I0223 08:46:48.882743 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5448bb6c56-5br5b" Feb 23 08:46:48 crc kubenswrapper[5047]: I0223 08:46:48.884965 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5448bb6c56-5br5b" Feb 23 08:46:48 crc kubenswrapper[5047]: I0223 08:46:48.996968 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7fc7875764-v8676"] Feb 23 08:46:48 crc kubenswrapper[5047]: I0223 08:46:48.997358 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7fc7875764-v8676" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-httpd" containerID="cri-o://75ea2117156de6091553e73240a2df927f51cf2b5840af7ca4269efc526154b7" gracePeriod=30 Feb 23 08:46:48 crc kubenswrapper[5047]: I0223 08:46:48.997537 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7fc7875764-v8676" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-server" containerID="cri-o://bfadfd5027fb517aa3370b27dd53eac6df4ee5d83fb50fe95937c7fea74195aa" gracePeriod=30 Feb 23 08:46:49 crc kubenswrapper[5047]: I0223 08:46:49.495643 5047 generic.go:334] "Generic (PLEG): container finished" podID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerID="bfadfd5027fb517aa3370b27dd53eac6df4ee5d83fb50fe95937c7fea74195aa" exitCode=0 Feb 23 08:46:49 crc kubenswrapper[5047]: I0223 08:46:49.495682 
5047 generic.go:334] "Generic (PLEG): container finished" podID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerID="75ea2117156de6091553e73240a2df927f51cf2b5840af7ca4269efc526154b7" exitCode=0 Feb 23 08:46:49 crc kubenswrapper[5047]: I0223 08:46:49.495735 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7fc7875764-v8676" event={"ID":"41c5c59a-c3ea-4f82-8bf4-9a95787616f6","Type":"ContainerDied","Data":"bfadfd5027fb517aa3370b27dd53eac6df4ee5d83fb50fe95937c7fea74195aa"} Feb 23 08:46:49 crc kubenswrapper[5047]: I0223 08:46:49.495804 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7fc7875764-v8676" event={"ID":"41c5c59a-c3ea-4f82-8bf4-9a95787616f6","Type":"ContainerDied","Data":"75ea2117156de6091553e73240a2df927f51cf2b5840af7ca4269efc526154b7"} Feb 23 08:46:49 crc kubenswrapper[5047]: I0223 08:46:49.992282 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7fc7875764-v8676" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.103167 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-config-data\") pod \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.103263 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-etc-swift\") pod \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.103315 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg2ts\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-kube-api-access-mg2ts\") pod 
\"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.103343 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-combined-ca-bundle\") pod \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.103422 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-log-httpd\") pod \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.103440 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-run-httpd\") pod \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\" (UID: \"41c5c59a-c3ea-4f82-8bf4-9a95787616f6\") " Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.104387 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41c5c59a-c3ea-4f82-8bf4-9a95787616f6" (UID: "41c5c59a-c3ea-4f82-8bf4-9a95787616f6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.106473 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41c5c59a-c3ea-4f82-8bf4-9a95787616f6" (UID: "41c5c59a-c3ea-4f82-8bf4-9a95787616f6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.111165 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-kube-api-access-mg2ts" (OuterVolumeSpecName: "kube-api-access-mg2ts") pod "41c5c59a-c3ea-4f82-8bf4-9a95787616f6" (UID: "41c5c59a-c3ea-4f82-8bf4-9a95787616f6"). InnerVolumeSpecName "kube-api-access-mg2ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.112116 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "41c5c59a-c3ea-4f82-8bf4-9a95787616f6" (UID: "41c5c59a-c3ea-4f82-8bf4-9a95787616f6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.168382 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41c5c59a-c3ea-4f82-8bf4-9a95787616f6" (UID: "41c5c59a-c3ea-4f82-8bf4-9a95787616f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.182278 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-config-data" (OuterVolumeSpecName: "config-data") pod "41c5c59a-c3ea-4f82-8bf4-9a95787616f6" (UID: "41c5c59a-c3ea-4f82-8bf4-9a95787616f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.205722 5047 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.206085 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg2ts\" (UniqueName: \"kubernetes.io/projected/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-kube-api-access-mg2ts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.206151 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.206297 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.206356 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.206422 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5c59a-c3ea-4f82-8bf4-9a95787616f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.509384 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7fc7875764-v8676" event={"ID":"41c5c59a-c3ea-4f82-8bf4-9a95787616f6","Type":"ContainerDied","Data":"d49611fe956fa48252bcae42e81ff7c1512a70bfa7d317aa8ecb1ea449d671a6"} Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 
08:46:50.509450 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7fc7875764-v8676" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.509493 5047 scope.go:117] "RemoveContainer" containerID="bfadfd5027fb517aa3370b27dd53eac6df4ee5d83fb50fe95937c7fea74195aa" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.546652 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7fc7875764-v8676"] Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.547493 5047 scope.go:117] "RemoveContainer" containerID="75ea2117156de6091553e73240a2df927f51cf2b5840af7ca4269efc526154b7" Feb 23 08:46:50 crc kubenswrapper[5047]: I0223 08:46:50.563226 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7fc7875764-v8676"] Feb 23 08:46:52 crc kubenswrapper[5047]: I0223 08:46:52.362225 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" path="/var/lib/kubelet/pods/41c5c59a-c3ea-4f82-8bf4-9a95787616f6/volumes" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.907793 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lr25h"] Feb 23 08:46:55 crc kubenswrapper[5047]: E0223 08:46:55.909211 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-server" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909230 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-server" Feb 23 08:46:55 crc kubenswrapper[5047]: E0223 08:46:55.909265 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" containerName="dnsmasq-dns" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909274 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" 
containerName="dnsmasq-dns" Feb 23 08:46:55 crc kubenswrapper[5047]: E0223 08:46:55.909289 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd40f96-4019-455e-b292-19cd00b8e616" containerName="swift-ring-rebalance" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909298 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd40f96-4019-455e-b292-19cd00b8e616" containerName="swift-ring-rebalance" Feb 23 08:46:55 crc kubenswrapper[5047]: E0223 08:46:55.909339 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-httpd" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909346 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-httpd" Feb 23 08:46:55 crc kubenswrapper[5047]: E0223 08:46:55.909396 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" containerName="init" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909402 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" containerName="init" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909829 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-server" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909847 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c5c59a-c3ea-4f82-8bf4-9a95787616f6" containerName="proxy-httpd" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909888 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada6a2f2-1e71-45a6-91ef-73f00a6194f1" containerName="dnsmasq-dns" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.909897 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd40f96-4019-455e-b292-19cd00b8e616" 
containerName="swift-ring-rebalance" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.911150 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:55 crc kubenswrapper[5047]: I0223 08:46:55.966670 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lr25h"] Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.011294 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6c0a-account-create-update-jbdtd"] Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.012920 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.017871 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6c0a-account-create-update-jbdtd"] Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.019227 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.042031 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3824710f-3773-4281-8c49-b9799daa3671-operator-scripts\") pod \"cinder-db-create-lr25h\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.042083 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8c6r\" (UniqueName: \"kubernetes.io/projected/3824710f-3773-4281-8c49-b9799daa3671-kube-api-access-g8c6r\") pod \"cinder-db-create-lr25h\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.146859 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9tf\" (UniqueName: \"kubernetes.io/projected/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-kube-api-access-hp9tf\") pod \"cinder-6c0a-account-create-update-jbdtd\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.147230 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3824710f-3773-4281-8c49-b9799daa3671-operator-scripts\") pod \"cinder-db-create-lr25h\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.147330 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8c6r\" (UniqueName: \"kubernetes.io/projected/3824710f-3773-4281-8c49-b9799daa3671-kube-api-access-g8c6r\") pod \"cinder-db-create-lr25h\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.147568 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-operator-scripts\") pod \"cinder-6c0a-account-create-update-jbdtd\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.148203 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3824710f-3773-4281-8c49-b9799daa3671-operator-scripts\") pod \"cinder-db-create-lr25h\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.182565 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8c6r\" (UniqueName: \"kubernetes.io/projected/3824710f-3773-4281-8c49-b9799daa3671-kube-api-access-g8c6r\") pod \"cinder-db-create-lr25h\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.250854 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-operator-scripts\") pod \"cinder-6c0a-account-create-update-jbdtd\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.251053 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp9tf\" (UniqueName: \"kubernetes.io/projected/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-kube-api-access-hp9tf\") pod \"cinder-6c0a-account-create-update-jbdtd\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.252628 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-operator-scripts\") pod \"cinder-6c0a-account-create-update-jbdtd\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.280577 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp9tf\" (UniqueName: \"kubernetes.io/projected/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-kube-api-access-hp9tf\") pod \"cinder-6c0a-account-create-update-jbdtd\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc 
kubenswrapper[5047]: I0223 08:46:56.283185 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.342787 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.548974 5047 scope.go:117] "RemoveContainer" containerID="7204b4d0e16eefbedca9234be35959b70f726eadc685bee72d0427c7e67f3040" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.582126 5047 scope.go:117] "RemoveContainer" containerID="7c22430a25e9a0f924e9db23fa9441d74399301b524a94e01670b1358c8a671a" Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.734682 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6c0a-account-create-update-jbdtd"] Feb 23 08:46:56 crc kubenswrapper[5047]: W0223 08:46:56.740074 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd0727e5_c2c0_41dc_ab5b_42fefccaaa6d.slice/crio-13f9534db6d3e1a048e7f014a3c24f27bb308f85950ba9002bb8fcae558bc203 WatchSource:0}: Error finding container 13f9534db6d3e1a048e7f014a3c24f27bb308f85950ba9002bb8fcae558bc203: Status 404 returned error can't find the container with id 13f9534db6d3e1a048e7f014a3c24f27bb308f85950ba9002bb8fcae558bc203 Feb 23 08:46:56 crc kubenswrapper[5047]: I0223 08:46:56.828507 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lr25h"] Feb 23 08:46:56 crc kubenswrapper[5047]: W0223 08:46:56.835573 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3824710f_3773_4281_8c49_b9799daa3671.slice/crio-a033ae5d24b1736b1e6f47295a4d26412c61227d47e96ac2a136591bb0204c7b WatchSource:0}: Error finding container 
a033ae5d24b1736b1e6f47295a4d26412c61227d47e96ac2a136591bb0204c7b: Status 404 returned error can't find the container with id a033ae5d24b1736b1e6f47295a4d26412c61227d47e96ac2a136591bb0204c7b Feb 23 08:46:57 crc kubenswrapper[5047]: I0223 08:46:57.594512 5047 generic.go:334] "Generic (PLEG): container finished" podID="3824710f-3773-4281-8c49-b9799daa3671" containerID="74c4507af5b6b674cb7bb060a05413012bf15153873749d5c24ee526557f6a28" exitCode=0 Feb 23 08:46:57 crc kubenswrapper[5047]: I0223 08:46:57.594594 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lr25h" event={"ID":"3824710f-3773-4281-8c49-b9799daa3671","Type":"ContainerDied","Data":"74c4507af5b6b674cb7bb060a05413012bf15153873749d5c24ee526557f6a28"} Feb 23 08:46:57 crc kubenswrapper[5047]: I0223 08:46:57.595121 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lr25h" event={"ID":"3824710f-3773-4281-8c49-b9799daa3671","Type":"ContainerStarted","Data":"a033ae5d24b1736b1e6f47295a4d26412c61227d47e96ac2a136591bb0204c7b"} Feb 23 08:46:57 crc kubenswrapper[5047]: I0223 08:46:57.600644 5047 generic.go:334] "Generic (PLEG): container finished" podID="dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d" containerID="dd2ad55b13c125642d0625ac8fe328fbfd54175872bdaba0e73a09bc93d524dc" exitCode=0 Feb 23 08:46:57 crc kubenswrapper[5047]: I0223 08:46:57.600716 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6c0a-account-create-update-jbdtd" event={"ID":"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d","Type":"ContainerDied","Data":"dd2ad55b13c125642d0625ac8fe328fbfd54175872bdaba0e73a09bc93d524dc"} Feb 23 08:46:57 crc kubenswrapper[5047]: I0223 08:46:57.600816 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6c0a-account-create-update-jbdtd" event={"ID":"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d","Type":"ContainerStarted","Data":"13f9534db6d3e1a048e7f014a3c24f27bb308f85950ba9002bb8fcae558bc203"} Feb 23 08:46:59 crc 
kubenswrapper[5047]: I0223 08:46:59.159370 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.170040 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.222822 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3824710f-3773-4281-8c49-b9799daa3671-operator-scripts\") pod \"3824710f-3773-4281-8c49-b9799daa3671\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.222999 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp9tf\" (UniqueName: \"kubernetes.io/projected/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-kube-api-access-hp9tf\") pod \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.223064 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8c6r\" (UniqueName: \"kubernetes.io/projected/3824710f-3773-4281-8c49-b9799daa3671-kube-api-access-g8c6r\") pod \"3824710f-3773-4281-8c49-b9799daa3671\" (UID: \"3824710f-3773-4281-8c49-b9799daa3671\") " Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.223211 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-operator-scripts\") pod \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\" (UID: \"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d\") " Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.223869 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3824710f-3773-4281-8c49-b9799daa3671-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3824710f-3773-4281-8c49-b9799daa3671" (UID: "3824710f-3773-4281-8c49-b9799daa3671"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.224625 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d" (UID: "dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.231226 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3824710f-3773-4281-8c49-b9799daa3671-kube-api-access-g8c6r" (OuterVolumeSpecName: "kube-api-access-g8c6r") pod "3824710f-3773-4281-8c49-b9799daa3671" (UID: "3824710f-3773-4281-8c49-b9799daa3671"). InnerVolumeSpecName "kube-api-access-g8c6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.231313 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-kube-api-access-hp9tf" (OuterVolumeSpecName: "kube-api-access-hp9tf") pod "dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d" (UID: "dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d"). InnerVolumeSpecName "kube-api-access-hp9tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.325699 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.325847 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3824710f-3773-4281-8c49-b9799daa3671-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.325866 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp9tf\" (UniqueName: \"kubernetes.io/projected/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d-kube-api-access-hp9tf\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.325881 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8c6r\" (UniqueName: \"kubernetes.io/projected/3824710f-3773-4281-8c49-b9799daa3671-kube-api-access-g8c6r\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.624326 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lr25h" event={"ID":"3824710f-3773-4281-8c49-b9799daa3671","Type":"ContainerDied","Data":"a033ae5d24b1736b1e6f47295a4d26412c61227d47e96ac2a136591bb0204c7b"} Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.624393 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a033ae5d24b1736b1e6f47295a4d26412c61227d47e96ac2a136591bb0204c7b" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.624561 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lr25h" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.628227 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-jbdtd" Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.628463 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6c0a-account-create-update-jbdtd" event={"ID":"dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d","Type":"ContainerDied","Data":"13f9534db6d3e1a048e7f014a3c24f27bb308f85950ba9002bb8fcae558bc203"} Feb 23 08:46:59 crc kubenswrapper[5047]: I0223 08:46:59.628520 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13f9534db6d3e1a048e7f014a3c24f27bb308f85950ba9002bb8fcae558bc203" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.256675 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-57qv5"] Feb 23 08:47:01 crc kubenswrapper[5047]: E0223 08:47:01.257721 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3824710f-3773-4281-8c49-b9799daa3671" containerName="mariadb-database-create" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.257748 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3824710f-3773-4281-8c49-b9799daa3671" containerName="mariadb-database-create" Feb 23 08:47:01 crc kubenswrapper[5047]: E0223 08:47:01.257802 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d" containerName="mariadb-account-create-update" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.257816 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d" containerName="mariadb-account-create-update" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.258138 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3824710f-3773-4281-8c49-b9799daa3671" containerName="mariadb-database-create" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.258199 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d" 
containerName="mariadb-account-create-update" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.259126 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.261436 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z9xwp" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.262517 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.262549 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.271048 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-57qv5"] Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.371951 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db167786-3d0a-4918-9f70-dfd9610eb0cd-etc-machine-id\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.372037 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4mw5\" (UniqueName: \"kubernetes.io/projected/db167786-3d0a-4918-9f70-dfd9610eb0cd-kube-api-access-v4mw5\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.372962 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-scripts\") pod \"cinder-db-sync-57qv5\" (UID: 
\"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.373042 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-config-data\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.373071 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-db-sync-config-data\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.373200 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-combined-ca-bundle\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.474994 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4mw5\" (UniqueName: \"kubernetes.io/projected/db167786-3d0a-4918-9f70-dfd9610eb0cd-kube-api-access-v4mw5\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.475087 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-scripts\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " 
pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.475120 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-config-data\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.475148 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-db-sync-config-data\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.475220 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-combined-ca-bundle\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.475266 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db167786-3d0a-4918-9f70-dfd9610eb0cd-etc-machine-id\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.475391 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db167786-3d0a-4918-9f70-dfd9610eb0cd-etc-machine-id\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.484775 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-combined-ca-bundle\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.485064 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-scripts\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.485683 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-config-data\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.487695 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-db-sync-config-data\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.498958 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4mw5\" (UniqueName: \"kubernetes.io/projected/db167786-3d0a-4918-9f70-dfd9610eb0cd-kube-api-access-v4mw5\") pod \"cinder-db-sync-57qv5\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:01 crc kubenswrapper[5047]: I0223 08:47:01.615935 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:02 crc kubenswrapper[5047]: I0223 08:47:02.123806 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-57qv5"] Feb 23 08:47:02 crc kubenswrapper[5047]: I0223 08:47:02.659249 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57qv5" event={"ID":"db167786-3d0a-4918-9f70-dfd9610eb0cd","Type":"ContainerStarted","Data":"0001d428c4003b8ccd09985a2c1e9265cebf1cc61b22657d27649a1aaaf73716"} Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.300976 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4npjs"] Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.321593 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.375957 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4npjs"] Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.414532 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-utilities\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.414887 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpm9l\" (UniqueName: \"kubernetes.io/projected/3443e301-4942-4ae3-b3e9-5ab7dbee8973-kube-api-access-qpm9l\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.415456 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-catalog-content\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.518143 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-utilities\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.518221 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpm9l\" (UniqueName: \"kubernetes.io/projected/3443e301-4942-4ae3-b3e9-5ab7dbee8973-kube-api-access-qpm9l\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.518301 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-catalog-content\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.519036 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-catalog-content\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.519334 5047 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-utilities\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.543519 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpm9l\" (UniqueName: \"kubernetes.io/projected/3443e301-4942-4ae3-b3e9-5ab7dbee8973-kube-api-access-qpm9l\") pod \"redhat-operators-4npjs\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:12 crc kubenswrapper[5047]: I0223 08:47:12.656176 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:22 crc kubenswrapper[5047]: I0223 08:47:22.547524 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4npjs"] Feb 23 08:47:22 crc kubenswrapper[5047]: W0223 08:47:22.555142 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3443e301_4942_4ae3_b3e9_5ab7dbee8973.slice/crio-ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5 WatchSource:0}: Error finding container ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5: Status 404 returned error can't find the container with id ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5 Feb 23 08:47:22 crc kubenswrapper[5047]: I0223 08:47:22.891032 5047 generic.go:334] "Generic (PLEG): container finished" podID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerID="4b22d9a52d0ff7b3a17c2d02abb2ecb330d8044668dd0b051ad6fcd7999c316a" exitCode=0 Feb 23 08:47:22 crc kubenswrapper[5047]: I0223 08:47:22.891164 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4npjs" 
event={"ID":"3443e301-4942-4ae3-b3e9-5ab7dbee8973","Type":"ContainerDied","Data":"4b22d9a52d0ff7b3a17c2d02abb2ecb330d8044668dd0b051ad6fcd7999c316a"} Feb 23 08:47:22 crc kubenswrapper[5047]: I0223 08:47:22.891432 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4npjs" event={"ID":"3443e301-4942-4ae3-b3e9-5ab7dbee8973","Type":"ContainerStarted","Data":"ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5"} Feb 23 08:47:23 crc kubenswrapper[5047]: I0223 08:47:23.903768 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4npjs" event={"ID":"3443e301-4942-4ae3-b3e9-5ab7dbee8973","Type":"ContainerStarted","Data":"33eae7fd00d3573717637b6d1b27a4f357b68e5237d264e046608aad9d1f4ff8"} Feb 23 08:47:23 crc kubenswrapper[5047]: I0223 08:47:23.906779 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57qv5" event={"ID":"db167786-3d0a-4918-9f70-dfd9610eb0cd","Type":"ContainerStarted","Data":"6e9bd22e781b5077b022248894ff9a1e31e4f6f3ba638dd2ab4b5bb0fc69e656"} Feb 23 08:47:25 crc kubenswrapper[5047]: I0223 08:47:25.932845 5047 generic.go:334] "Generic (PLEG): container finished" podID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerID="33eae7fd00d3573717637b6d1b27a4f357b68e5237d264e046608aad9d1f4ff8" exitCode=0 Feb 23 08:47:25 crc kubenswrapper[5047]: I0223 08:47:25.932930 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4npjs" event={"ID":"3443e301-4942-4ae3-b3e9-5ab7dbee8973","Type":"ContainerDied","Data":"33eae7fd00d3573717637b6d1b27a4f357b68e5237d264e046608aad9d1f4ff8"} Feb 23 08:47:25 crc kubenswrapper[5047]: I0223 08:47:25.962619 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-57qv5" podStartSLOduration=4.964432211 podStartE2EDuration="24.962594717s" podCreationTimestamp="2026-02-23 08:47:01 +0000 UTC" firstStartedPulling="2026-02-23 
08:47:02.137541543 +0000 UTC m=+7344.388868677" lastFinishedPulling="2026-02-23 08:47:22.135704049 +0000 UTC m=+7364.387031183" observedRunningTime="2026-02-23 08:47:23.958082588 +0000 UTC m=+7366.209409742" watchObservedRunningTime="2026-02-23 08:47:25.962594717 +0000 UTC m=+7368.213921851" Feb 23 08:47:26 crc kubenswrapper[5047]: I0223 08:47:26.951714 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4npjs" event={"ID":"3443e301-4942-4ae3-b3e9-5ab7dbee8973","Type":"ContainerStarted","Data":"d3834b3ad00e430f6121ada4b6bc0f2b772619f5c429be533bf0a83a15d57d54"} Feb 23 08:47:26 crc kubenswrapper[5047]: I0223 08:47:26.956173 5047 generic.go:334] "Generic (PLEG): container finished" podID="db167786-3d0a-4918-9f70-dfd9610eb0cd" containerID="6e9bd22e781b5077b022248894ff9a1e31e4f6f3ba638dd2ab4b5bb0fc69e656" exitCode=0 Feb 23 08:47:26 crc kubenswrapper[5047]: I0223 08:47:26.956250 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57qv5" event={"ID":"db167786-3d0a-4918-9f70-dfd9610eb0cd","Type":"ContainerDied","Data":"6e9bd22e781b5077b022248894ff9a1e31e4f6f3ba638dd2ab4b5bb0fc69e656"} Feb 23 08:47:26 crc kubenswrapper[5047]: I0223 08:47:26.988368 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4npjs" podStartSLOduration=11.542717457 podStartE2EDuration="14.988342191s" podCreationTimestamp="2026-02-23 08:47:12 +0000 UTC" firstStartedPulling="2026-02-23 08:47:22.894338613 +0000 UTC m=+7365.145665747" lastFinishedPulling="2026-02-23 08:47:26.339963347 +0000 UTC m=+7368.591290481" observedRunningTime="2026-02-23 08:47:26.985968457 +0000 UTC m=+7369.237295631" watchObservedRunningTime="2026-02-23 08:47:26.988342191 +0000 UTC m=+7369.239669335" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.335318 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.412672 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db167786-3d0a-4918-9f70-dfd9610eb0cd-etc-machine-id\") pod \"db167786-3d0a-4918-9f70-dfd9610eb0cd\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.412716 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-db-sync-config-data\") pod \"db167786-3d0a-4918-9f70-dfd9610eb0cd\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.413180 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4mw5\" (UniqueName: \"kubernetes.io/projected/db167786-3d0a-4918-9f70-dfd9610eb0cd-kube-api-access-v4mw5\") pod \"db167786-3d0a-4918-9f70-dfd9610eb0cd\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.413212 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db167786-3d0a-4918-9f70-dfd9610eb0cd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "db167786-3d0a-4918-9f70-dfd9610eb0cd" (UID: "db167786-3d0a-4918-9f70-dfd9610eb0cd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.413398 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-config-data\") pod \"db167786-3d0a-4918-9f70-dfd9610eb0cd\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.413570 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-combined-ca-bundle\") pod \"db167786-3d0a-4918-9f70-dfd9610eb0cd\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.414170 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-scripts\") pod \"db167786-3d0a-4918-9f70-dfd9610eb0cd\" (UID: \"db167786-3d0a-4918-9f70-dfd9610eb0cd\") " Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.415416 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db167786-3d0a-4918-9f70-dfd9610eb0cd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.429971 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "db167786-3d0a-4918-9f70-dfd9610eb0cd" (UID: "db167786-3d0a-4918-9f70-dfd9610eb0cd"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.430043 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db167786-3d0a-4918-9f70-dfd9610eb0cd-kube-api-access-v4mw5" (OuterVolumeSpecName: "kube-api-access-v4mw5") pod "db167786-3d0a-4918-9f70-dfd9610eb0cd" (UID: "db167786-3d0a-4918-9f70-dfd9610eb0cd"). InnerVolumeSpecName "kube-api-access-v4mw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.433202 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-scripts" (OuterVolumeSpecName: "scripts") pod "db167786-3d0a-4918-9f70-dfd9610eb0cd" (UID: "db167786-3d0a-4918-9f70-dfd9610eb0cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.446994 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db167786-3d0a-4918-9f70-dfd9610eb0cd" (UID: "db167786-3d0a-4918-9f70-dfd9610eb0cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.472140 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-config-data" (OuterVolumeSpecName: "config-data") pod "db167786-3d0a-4918-9f70-dfd9610eb0cd" (UID: "db167786-3d0a-4918-9f70-dfd9610eb0cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.520878 5047 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.520948 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4mw5\" (UniqueName: \"kubernetes.io/projected/db167786-3d0a-4918-9f70-dfd9610eb0cd-kube-api-access-v4mw5\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.520980 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.520996 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.521011 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db167786-3d0a-4918-9f70-dfd9610eb0cd-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.975608 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57qv5" event={"ID":"db167786-3d0a-4918-9f70-dfd9610eb0cd","Type":"ContainerDied","Data":"0001d428c4003b8ccd09985a2c1e9265cebf1cc61b22657d27649a1aaaf73716"} Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.975670 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0001d428c4003b8ccd09985a2c1e9265cebf1cc61b22657d27649a1aaaf73716" Feb 23 08:47:28 crc kubenswrapper[5047]: I0223 08:47:28.975763 5047 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-57qv5" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.418469 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-788bb74c87-zcjrv"] Feb 23 08:47:29 crc kubenswrapper[5047]: E0223 08:47:29.418849 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db167786-3d0a-4918-9f70-dfd9610eb0cd" containerName="cinder-db-sync" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.418864 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="db167786-3d0a-4918-9f70-dfd9610eb0cd" containerName="cinder-db-sync" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.419074 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="db167786-3d0a-4918-9f70-dfd9610eb0cd" containerName="cinder-db-sync" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.421456 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.453021 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-788bb74c87-zcjrv"] Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.538825 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9lt\" (UniqueName: \"kubernetes.io/projected/14629101-8773-4daf-8eab-2eb6d4b555dc-kube-api-access-nl9lt\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.538968 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-sb\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " 
pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.539084 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-config\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.539121 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-nb\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.539177 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-dns-svc\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.545337 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.548223 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.559186 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z9xwp" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.559304 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.559387 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.559445 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.563966 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640488 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-scripts\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640568 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-config\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640594 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-logs\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 
08:47:29.640622 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-nb\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640669 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-dns-svc\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640687 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640702 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dj4v\" (UniqueName: \"kubernetes.io/projected/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-kube-api-access-5dj4v\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640738 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9lt\" (UniqueName: \"kubernetes.io/projected/14629101-8773-4daf-8eab-2eb6d4b555dc-kube-api-access-nl9lt\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640756 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640777 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640817 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-sb\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.640856 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data-custom\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.641800 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-config\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.642375 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-nb\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.642878 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-dns-svc\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.643565 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-sb\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.660759 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl9lt\" (UniqueName: \"kubernetes.io/projected/14629101-8773-4daf-8eab-2eb6d4b555dc-kube-api-access-nl9lt\") pod \"dnsmasq-dns-788bb74c87-zcjrv\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") " pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.742603 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data-custom\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.742662 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-scripts\") pod \"cinder-api-0\" (UID: 
\"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.742718 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-logs\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.743246 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-logs\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.743484 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.743517 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dj4v\" (UniqueName: \"kubernetes.io/projected/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-kube-api-access-5dj4v\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.743559 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.743585 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.743671 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.747027 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.747219 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.747297 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-scripts\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.748460 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.756587 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.762886 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dj4v\" (UniqueName: \"kubernetes.io/projected/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-kube-api-access-5dj4v\") pod \"cinder-api-0\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " pod="openstack/cinder-api-0" Feb 23 08:47:29 crc kubenswrapper[5047]: I0223 08:47:29.863767 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:47:30 crc kubenswrapper[5047]: I0223 08:47:30.253858 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-788bb74c87-zcjrv"] Feb 23 08:47:30 crc kubenswrapper[5047]: W0223 08:47:30.258934 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14629101_8773_4daf_8eab_2eb6d4b555dc.slice/crio-03563cbc08d9817fbfbd47e291680361789ca8c37e99946c9fb3442d9d88cb10 WatchSource:0}: Error finding container 03563cbc08d9817fbfbd47e291680361789ca8c37e99946c9fb3442d9d88cb10: Status 404 returned error can't find the container with id 03563cbc08d9817fbfbd47e291680361789ca8c37e99946c9fb3442d9d88cb10 Feb 23 08:47:30 crc kubenswrapper[5047]: I0223 08:47:30.411859 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:47:30 crc kubenswrapper[5047]: W0223 08:47:30.416833 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dbcbe51_dc4c_4e33_9c36_51a0d2da3d13.slice/crio-c612739f2bba06c5f8602588feccf8b4ad842622285c78876ec7840324d3d1d4 WatchSource:0}: Error finding container c612739f2bba06c5f8602588feccf8b4ad842622285c78876ec7840324d3d1d4: Status 404 returned error can't find the container with id 
c612739f2bba06c5f8602588feccf8b4ad842622285c78876ec7840324d3d1d4 Feb 23 08:47:31 crc kubenswrapper[5047]: I0223 08:47:31.005028 5047 generic.go:334] "Generic (PLEG): container finished" podID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerID="2bbdd0c4f119bfaaba3e1360ad16a71d8ab9a176ad0f3c50c51c378426156ac5" exitCode=0 Feb 23 08:47:31 crc kubenswrapper[5047]: I0223 08:47:31.005120 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" event={"ID":"14629101-8773-4daf-8eab-2eb6d4b555dc","Type":"ContainerDied","Data":"2bbdd0c4f119bfaaba3e1360ad16a71d8ab9a176ad0f3c50c51c378426156ac5"} Feb 23 08:47:31 crc kubenswrapper[5047]: I0223 08:47:31.006334 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" event={"ID":"14629101-8773-4daf-8eab-2eb6d4b555dc","Type":"ContainerStarted","Data":"03563cbc08d9817fbfbd47e291680361789ca8c37e99946c9fb3442d9d88cb10"} Feb 23 08:47:31 crc kubenswrapper[5047]: I0223 08:47:31.007735 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13","Type":"ContainerStarted","Data":"c612739f2bba06c5f8602588feccf8b4ad842622285c78876ec7840324d3d1d4"} Feb 23 08:47:31 crc kubenswrapper[5047]: I0223 08:47:31.954031 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.017391 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" event={"ID":"14629101-8773-4daf-8eab-2eb6d4b555dc","Type":"ContainerStarted","Data":"48e0650f732385d422f2d60e1c549dbcc8d57ef7e39e1ded07af1b09e5f8788b"} Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.019057 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.020662 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-api-0" event={"ID":"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13","Type":"ContainerStarted","Data":"98f98c30a1b605751204bfdb90030856009d784af92bf06a33e32d845bce2022"} Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.020687 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13","Type":"ContainerStarted","Data":"b8cd0bfdc96f2afe3f84de64c5bc5ec56b055b4f550ed1cd332dc3788459d930"} Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.020783 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api-log" containerID="cri-o://b8cd0bfdc96f2afe3f84de64c5bc5ec56b055b4f550ed1cd332dc3788459d930" gracePeriod=30 Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.021008 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.021042 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api" containerID="cri-o://98f98c30a1b605751204bfdb90030856009d784af92bf06a33e32d845bce2022" gracePeriod=30 Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.045415 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" podStartSLOduration=3.045394592 podStartE2EDuration="3.045394592s" podCreationTimestamp="2026-02-23 08:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:32.035381084 +0000 UTC m=+7374.286708218" watchObservedRunningTime="2026-02-23 08:47:32.045394592 +0000 UTC m=+7374.296721726" Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.061238 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.061206887 podStartE2EDuration="3.061206887s" podCreationTimestamp="2026-02-23 08:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:32.05388659 +0000 UTC m=+7374.305213724" watchObservedRunningTime="2026-02-23 08:47:32.061206887 +0000 UTC m=+7374.312534011" Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.656860 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:32 crc kubenswrapper[5047]: I0223 08:47:32.656972 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:33 crc kubenswrapper[5047]: I0223 08:47:33.032316 5047 generic.go:334] "Generic (PLEG): container finished" podID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerID="b8cd0bfdc96f2afe3f84de64c5bc5ec56b055b4f550ed1cd332dc3788459d930" exitCode=143 Feb 23 08:47:33 crc kubenswrapper[5047]: I0223 08:47:33.032405 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13","Type":"ContainerDied","Data":"b8cd0bfdc96f2afe3f84de64c5bc5ec56b055b4f550ed1cd332dc3788459d930"} Feb 23 08:47:33 crc kubenswrapper[5047]: I0223 08:47:33.708240 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4npjs" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="registry-server" probeResult="failure" output=< Feb 23 08:47:33 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:47:33 crc kubenswrapper[5047]: > Feb 23 08:47:39 crc kubenswrapper[5047]: I0223 08:47:39.749248 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" Feb 23 08:47:39 crc kubenswrapper[5047]: I0223 08:47:39.847600 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c98fc85f-sw8fn"] Feb 23 08:47:39 crc kubenswrapper[5047]: I0223 08:47:39.848520 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" podUID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerName="dnsmasq-dns" containerID="cri-o://4b5105c60c8298f6cd7a7fc1bfb183c9fa6eb74473d079079174a517180a28a8" gracePeriod=10 Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.121549 5047 generic.go:334] "Generic (PLEG): container finished" podID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerID="4b5105c60c8298f6cd7a7fc1bfb183c9fa6eb74473d079079174a517180a28a8" exitCode=0 Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.121602 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" event={"ID":"dda27df2-528e-46fb-9e17-1dfacd9e70ee","Type":"ContainerDied","Data":"4b5105c60c8298f6cd7a7fc1bfb183c9fa6eb74473d079079174a517180a28a8"} Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.412547 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.509216 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5fcc\" (UniqueName: \"kubernetes.io/projected/dda27df2-528e-46fb-9e17-1dfacd9e70ee-kube-api-access-x5fcc\") pod \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.509340 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-dns-svc\") pod \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.509372 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-config\") pod \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.509488 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-sb\") pod \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.509628 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-nb\") pod \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\" (UID: \"dda27df2-528e-46fb-9e17-1dfacd9e70ee\") " Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.516141 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dda27df2-528e-46fb-9e17-1dfacd9e70ee-kube-api-access-x5fcc" (OuterVolumeSpecName: "kube-api-access-x5fcc") pod "dda27df2-528e-46fb-9e17-1dfacd9e70ee" (UID: "dda27df2-528e-46fb-9e17-1dfacd9e70ee"). InnerVolumeSpecName "kube-api-access-x5fcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.551988 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dda27df2-528e-46fb-9e17-1dfacd9e70ee" (UID: "dda27df2-528e-46fb-9e17-1dfacd9e70ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.559834 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dda27df2-528e-46fb-9e17-1dfacd9e70ee" (UID: "dda27df2-528e-46fb-9e17-1dfacd9e70ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.561385 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-config" (OuterVolumeSpecName: "config") pod "dda27df2-528e-46fb-9e17-1dfacd9e70ee" (UID: "dda27df2-528e-46fb-9e17-1dfacd9e70ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.565729 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dda27df2-528e-46fb-9e17-1dfacd9e70ee" (UID: "dda27df2-528e-46fb-9e17-1dfacd9e70ee"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.611512 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.611736 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.611806 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5fcc\" (UniqueName: \"kubernetes.io/projected/dda27df2-528e-46fb-9e17-1dfacd9e70ee-kube-api-access-x5fcc\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.611880 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:40 crc kubenswrapper[5047]: I0223 08:47:40.611955 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dda27df2-528e-46fb-9e17-1dfacd9e70ee-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:41 crc kubenswrapper[5047]: I0223 08:47:41.135437 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" event={"ID":"dda27df2-528e-46fb-9e17-1dfacd9e70ee","Type":"ContainerDied","Data":"ccd6e19bb9b12cbeab5c0792f2598c377bdd50dc5ece7887f7c30a26747e4807"} Feb 23 08:47:41 crc kubenswrapper[5047]: I0223 08:47:41.135503 5047 scope.go:117] "RemoveContainer" containerID="4b5105c60c8298f6cd7a7fc1bfb183c9fa6eb74473d079079174a517180a28a8" Feb 23 08:47:41 crc kubenswrapper[5047]: I0223 08:47:41.135545 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c98fc85f-sw8fn" Feb 23 08:47:41 crc kubenswrapper[5047]: I0223 08:47:41.160369 5047 scope.go:117] "RemoveContainer" containerID="fcb586ad993c48e8b973ec29427639c02accea5ae41172e65e4e6b2908bec5bc" Feb 23 08:47:41 crc kubenswrapper[5047]: I0223 08:47:41.204450 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c98fc85f-sw8fn"] Feb 23 08:47:41 crc kubenswrapper[5047]: I0223 08:47:41.217183 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77c98fc85f-sw8fn"] Feb 23 08:47:41 crc kubenswrapper[5047]: I0223 08:47:41.951647 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 23 08:47:42 crc kubenswrapper[5047]: I0223 08:47:42.353421 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" path="/var/lib/kubelet/pods/dda27df2-528e-46fb-9e17-1dfacd9e70ee/volumes" Feb 23 08:47:42 crc kubenswrapper[5047]: I0223 08:47:42.708882 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:42 crc kubenswrapper[5047]: I0223 08:47:42.760661 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:43 crc kubenswrapper[5047]: I0223 08:47:43.487983 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4npjs"] Feb 23 08:47:44 crc kubenswrapper[5047]: I0223 08:47:44.168138 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4npjs" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="registry-server" containerID="cri-o://d3834b3ad00e430f6121ada4b6bc0f2b772619f5c429be533bf0a83a15d57d54" gracePeriod=2 Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.181470 5047 generic.go:334] "Generic 
(PLEG): container finished" podID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerID="d3834b3ad00e430f6121ada4b6bc0f2b772619f5c429be533bf0a83a15d57d54" exitCode=0 Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.181572 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4npjs" event={"ID":"3443e301-4942-4ae3-b3e9-5ab7dbee8973","Type":"ContainerDied","Data":"d3834b3ad00e430f6121ada4b6bc0f2b772619f5c429be533bf0a83a15d57d54"} Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.182103 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4npjs" event={"ID":"3443e301-4942-4ae3-b3e9-5ab7dbee8973","Type":"ContainerDied","Data":"ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5"} Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.182206 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5" Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.195454 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.316755 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpm9l\" (UniqueName: \"kubernetes.io/projected/3443e301-4942-4ae3-b3e9-5ab7dbee8973-kube-api-access-qpm9l\") pod \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.317170 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-utilities\") pod \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.317278 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-catalog-content\") pod \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\" (UID: \"3443e301-4942-4ae3-b3e9-5ab7dbee8973\") " Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.320669 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-utilities" (OuterVolumeSpecName: "utilities") pod "3443e301-4942-4ae3-b3e9-5ab7dbee8973" (UID: "3443e301-4942-4ae3-b3e9-5ab7dbee8973"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.325410 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3443e301-4942-4ae3-b3e9-5ab7dbee8973-kube-api-access-qpm9l" (OuterVolumeSpecName: "kube-api-access-qpm9l") pod "3443e301-4942-4ae3-b3e9-5ab7dbee8973" (UID: "3443e301-4942-4ae3-b3e9-5ab7dbee8973"). InnerVolumeSpecName "kube-api-access-qpm9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.420307 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.420340 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpm9l\" (UniqueName: \"kubernetes.io/projected/3443e301-4942-4ae3-b3e9-5ab7dbee8973-kube-api-access-qpm9l\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.423087 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3443e301-4942-4ae3-b3e9-5ab7dbee8973" (UID: "3443e301-4942-4ae3-b3e9-5ab7dbee8973"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:47:45 crc kubenswrapper[5047]: I0223 08:47:45.522555 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3443e301-4942-4ae3-b3e9-5ab7dbee8973-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:46 crc kubenswrapper[5047]: I0223 08:47:46.193958 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4npjs" Feb 23 08:47:46 crc kubenswrapper[5047]: I0223 08:47:46.260336 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4npjs"] Feb 23 08:47:46 crc kubenswrapper[5047]: I0223 08:47:46.279402 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4npjs"] Feb 23 08:47:46 crc kubenswrapper[5047]: I0223 08:47:46.355277 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" path="/var/lib/kubelet/pods/3443e301-4942-4ae3-b3e9-5ab7dbee8973/volumes" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.031230 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gcxqb"] Feb 23 08:47:57 crc kubenswrapper[5047]: E0223 08:47:57.032231 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerName="dnsmasq-dns" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.032252 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerName="dnsmasq-dns" Feb 23 08:47:57 crc kubenswrapper[5047]: E0223 08:47:57.032279 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="registry-server" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.032291 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="registry-server" Feb 23 08:47:57 crc kubenswrapper[5047]: E0223 08:47:57.032316 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerName="init" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.032331 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerName="init" Feb 
23 08:47:57 crc kubenswrapper[5047]: E0223 08:47:57.032367 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="extract-utilities" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.032379 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="extract-utilities" Feb 23 08:47:57 crc kubenswrapper[5047]: E0223 08:47:57.032404 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="extract-content" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.032416 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="extract-content" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.032704 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3443e301-4942-4ae3-b3e9-5ab7dbee8973" containerName="registry-server" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.032751 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda27df2-528e-46fb-9e17-1dfacd9e70ee" containerName="dnsmasq-dns" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.035239 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.059784 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcxqb"] Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.193245 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-utilities\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.193328 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-catalog-content\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.193667 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/c4cc1435-e723-49a2-8894-bac86f2cceca-kube-api-access-z2shv\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.295634 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/c4cc1435-e723-49a2-8894-bac86f2cceca-kube-api-access-z2shv\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.296559 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-utilities\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.296872 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-catalog-content\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.297295 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-utilities\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.297532 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-catalog-content\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.321253 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/c4cc1435-e723-49a2-8894-bac86f2cceca-kube-api-access-z2shv\") pod \"redhat-marketplace-gcxqb\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.374570 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:47:57 crc kubenswrapper[5047]: I0223 08:47:57.660850 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcxqb"] Feb 23 08:47:58 crc kubenswrapper[5047]: I0223 08:47:58.337964 5047 generic.go:334] "Generic (PLEG): container finished" podID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerID="0c4ae39a993f67b6c8a66c49094ec07dcf60eaafc3fdcbdb2029f7d11ffa0950" exitCode=0 Feb 23 08:47:58 crc kubenswrapper[5047]: I0223 08:47:58.338023 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcxqb" event={"ID":"c4cc1435-e723-49a2-8894-bac86f2cceca","Type":"ContainerDied","Data":"0c4ae39a993f67b6c8a66c49094ec07dcf60eaafc3fdcbdb2029f7d11ffa0950"} Feb 23 08:47:58 crc kubenswrapper[5047]: I0223 08:47:58.338058 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcxqb" event={"ID":"c4cc1435-e723-49a2-8894-bac86f2cceca","Type":"ContainerStarted","Data":"758007ab6ff39b470829197ba9f7494019de8dbea65536a3a363f5a40925d0f0"} Feb 23 08:47:59 crc kubenswrapper[5047]: I0223 08:47:59.349019 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcxqb" event={"ID":"c4cc1435-e723-49a2-8894-bac86f2cceca","Type":"ContainerStarted","Data":"4aa359bcaac449785a7d3d3df0a1dbebab24968a3cefad0445fde787bfdb57e3"} Feb 23 08:48:00 crc kubenswrapper[5047]: I0223 08:48:00.371142 5047 generic.go:334] "Generic (PLEG): container finished" podID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerID="4aa359bcaac449785a7d3d3df0a1dbebab24968a3cefad0445fde787bfdb57e3" exitCode=0 Feb 23 08:48:00 crc kubenswrapper[5047]: I0223 08:48:00.371685 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcxqb" 
event={"ID":"c4cc1435-e723-49a2-8894-bac86f2cceca","Type":"ContainerDied","Data":"4aa359bcaac449785a7d3d3df0a1dbebab24968a3cefad0445fde787bfdb57e3"} Feb 23 08:48:01 crc kubenswrapper[5047]: I0223 08:48:01.398676 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcxqb" event={"ID":"c4cc1435-e723-49a2-8894-bac86f2cceca","Type":"ContainerStarted","Data":"9177b4b644eb8eea46c3856893d8c1c6bff7c11a802fa3f7fe8f88a20ff6d1f7"} Feb 23 08:48:01 crc kubenswrapper[5047]: I0223 08:48:01.428421 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gcxqb" podStartSLOduration=2.9716348630000002 podStartE2EDuration="5.428399112s" podCreationTimestamp="2026-02-23 08:47:56 +0000 UTC" firstStartedPulling="2026-02-23 08:47:58.339413142 +0000 UTC m=+7400.590740286" lastFinishedPulling="2026-02-23 08:48:00.796177361 +0000 UTC m=+7403.047504535" observedRunningTime="2026-02-23 08:48:01.419338619 +0000 UTC m=+7403.670665793" watchObservedRunningTime="2026-02-23 08:48:01.428399112 +0000 UTC m=+7403.679726256" Feb 23 08:48:02 crc kubenswrapper[5047]: E0223 08:48:02.386690 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3443e301_4942_4ae3_b3e9_5ab7dbee8973.slice/crio-ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5\": RecentStats: unable to find data in memory cache]" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.409712 5047 generic.go:334] "Generic (PLEG): container finished" podID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerID="98f98c30a1b605751204bfdb90030856009d784af92bf06a33e32d845bce2022" exitCode=137 Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.409798 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13","Type":"ContainerDied","Data":"98f98c30a1b605751204bfdb90030856009d784af92bf06a33e32d845bce2022"} Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.608500 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.715503 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-logs\") pod \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.715583 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-combined-ca-bundle\") pod \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.715650 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-scripts\") pod \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.715676 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data-custom\") pod \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.715698 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dj4v\" (UniqueName: \"kubernetes.io/projected/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-kube-api-access-5dj4v\") pod 
\"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.715740 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data\") pod \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.716558 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-etc-machine-id\") pod \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\" (UID: \"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13\") " Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.716856 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" (UID: "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.717437 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.717799 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-logs" (OuterVolumeSpecName: "logs") pod "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" (UID: "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.722753 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-scripts" (OuterVolumeSpecName: "scripts") pod "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" (UID: "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.725022 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" (UID: "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.725217 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-kube-api-access-5dj4v" (OuterVolumeSpecName: "kube-api-access-5dj4v") pod "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" (UID: "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13"). InnerVolumeSpecName "kube-api-access-5dj4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.766782 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" (UID: "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.796548 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data" (OuterVolumeSpecName: "config-data") pod "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" (UID: "2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.820704 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.820745 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.820757 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.820766 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.820776 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dj4v\" (UniqueName: \"kubernetes.io/projected/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-kube-api-access-5dj4v\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:02 crc kubenswrapper[5047]: I0223 08:48:02.820787 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.421117 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13","Type":"ContainerDied","Data":"c612739f2bba06c5f8602588feccf8b4ad842622285c78876ec7840324d3d1d4"} Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.421197 5047 scope.go:117] "RemoveContainer" containerID="98f98c30a1b605751204bfdb90030856009d784af92bf06a33e32d845bce2022" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.421426 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.443694 5047 scope.go:117] "RemoveContainer" containerID="b8cd0bfdc96f2afe3f84de64c5bc5ec56b055b4f550ed1cd332dc3788459d930" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.466737 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.474409 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.515095 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:03 crc kubenswrapper[5047]: E0223 08:48:03.515666 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api-log" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.515687 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api-log" Feb 23 08:48:03 crc kubenswrapper[5047]: E0223 08:48:03.515727 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api" Feb 23 08:48:03 crc 
kubenswrapper[5047]: I0223 08:48:03.515739 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.516110 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api-log" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.516154 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" containerName="cinder-api" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.519369 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.528125 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.529328 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.529967 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.532738 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.532757 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z9xwp" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.533151 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.533404 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.537409 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data-custom\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.538426 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hxw\" (UniqueName: \"kubernetes.io/projected/e453f6f5-cef4-4ff2-a391-192467480ea6-kube-api-access-p4hxw\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.538547 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.538727 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.538899 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e453f6f5-cef4-4ff2-a391-192467480ea6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.539075 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.539184 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e453f6f5-cef4-4ff2-a391-192467480ea6-logs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.539357 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.539519 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-scripts\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.644147 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e453f6f5-cef4-4ff2-a391-192467480ea6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.644751 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 
08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.644322 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e453f6f5-cef4-4ff2-a391-192467480ea6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.645136 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e453f6f5-cef4-4ff2-a391-192467480ea6-logs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.645299 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.645560 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-scripts\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.645747 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data-custom\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.646580 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4hxw\" (UniqueName: 
\"kubernetes.io/projected/e453f6f5-cef4-4ff2-a391-192467480ea6-kube-api-access-p4hxw\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.646823 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.647044 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.648137 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e453f6f5-cef4-4ff2-a391-192467480ea6-logs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.653696 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.653750 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-scripts\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.660879 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.661227 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.661499 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data-custom\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.667881 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hxw\" (UniqueName: \"kubernetes.io/projected/e453f6f5-cef4-4ff2-a391-192467480ea6-kube-api-access-p4hxw\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.671800 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " pod="openstack/cinder-api-0" Feb 23 08:48:03 crc kubenswrapper[5047]: I0223 08:48:03.851738 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:04 crc kubenswrapper[5047]: I0223 08:48:04.144177 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:04 crc kubenswrapper[5047]: W0223 08:48:04.153384 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode453f6f5_cef4_4ff2_a391_192467480ea6.slice/crio-7858e848acbd62e0eeb6932d4ade4fa7135412f5595ba5320ff3fdf6b86f2f2e WatchSource:0}: Error finding container 7858e848acbd62e0eeb6932d4ade4fa7135412f5595ba5320ff3fdf6b86f2f2e: Status 404 returned error can't find the container with id 7858e848acbd62e0eeb6932d4ade4fa7135412f5595ba5320ff3fdf6b86f2f2e Feb 23 08:48:04 crc kubenswrapper[5047]: I0223 08:48:04.351482 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13" path="/var/lib/kubelet/pods/2dbcbe51-dc4c-4e33-9c36-51a0d2da3d13/volumes" Feb 23 08:48:04 crc kubenswrapper[5047]: I0223 08:48:04.437238 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e453f6f5-cef4-4ff2-a391-192467480ea6","Type":"ContainerStarted","Data":"7858e848acbd62e0eeb6932d4ade4fa7135412f5595ba5320ff3fdf6b86f2f2e"} Feb 23 08:48:05 crc kubenswrapper[5047]: I0223 08:48:05.461701 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e453f6f5-cef4-4ff2-a391-192467480ea6","Type":"ContainerStarted","Data":"ba80a118cc1cae29394933b6f8cc8932fe14ae9ed0fef6b07af3ed37c8dfb24d"} Feb 23 08:48:06 crc kubenswrapper[5047]: I0223 08:48:06.481346 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e453f6f5-cef4-4ff2-a391-192467480ea6","Type":"ContainerStarted","Data":"5522886e1eb4d6218403b0092117849746ae36ff10eccfabe8e57edb9436a651"} Feb 23 08:48:06 crc kubenswrapper[5047]: I0223 08:48:06.481953 5047 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 08:48:06 crc kubenswrapper[5047]: I0223 08:48:06.506105 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.5060795970000003 podStartE2EDuration="3.506079597s" podCreationTimestamp="2026-02-23 08:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:48:06.505619473 +0000 UTC m=+7408.756946637" watchObservedRunningTime="2026-02-23 08:48:06.506079597 +0000 UTC m=+7408.757406761" Feb 23 08:48:07 crc kubenswrapper[5047]: I0223 08:48:07.375648 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:48:07 crc kubenswrapper[5047]: I0223 08:48:07.376222 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:48:07 crc kubenswrapper[5047]: I0223 08:48:07.468360 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:48:07 crc kubenswrapper[5047]: I0223 08:48:07.594848 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:48:08 crc kubenswrapper[5047]: I0223 08:48:08.016801 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcxqb"] Feb 23 08:48:09 crc kubenswrapper[5047]: I0223 08:48:09.526653 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gcxqb" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="registry-server" containerID="cri-o://9177b4b644eb8eea46c3856893d8c1c6bff7c11a802fa3f7fe8f88a20ff6d1f7" gracePeriod=2 Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.540035 5047 
generic.go:334] "Generic (PLEG): container finished" podID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerID="9177b4b644eb8eea46c3856893d8c1c6bff7c11a802fa3f7fe8f88a20ff6d1f7" exitCode=0 Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.540131 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcxqb" event={"ID":"c4cc1435-e723-49a2-8894-bac86f2cceca","Type":"ContainerDied","Data":"9177b4b644eb8eea46c3856893d8c1c6bff7c11a802fa3f7fe8f88a20ff6d1f7"} Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.540597 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcxqb" event={"ID":"c4cc1435-e723-49a2-8894-bac86f2cceca","Type":"ContainerDied","Data":"758007ab6ff39b470829197ba9f7494019de8dbea65536a3a363f5a40925d0f0"} Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.540623 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758007ab6ff39b470829197ba9f7494019de8dbea65536a3a363f5a40925d0f0" Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.549464 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.708746 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-catalog-content\") pod \"c4cc1435-e723-49a2-8894-bac86f2cceca\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.709078 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-utilities\") pod \"c4cc1435-e723-49a2-8894-bac86f2cceca\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.709170 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/c4cc1435-e723-49a2-8894-bac86f2cceca-kube-api-access-z2shv\") pod \"c4cc1435-e723-49a2-8894-bac86f2cceca\" (UID: \"c4cc1435-e723-49a2-8894-bac86f2cceca\") " Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.710653 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-utilities" (OuterVolumeSpecName: "utilities") pod "c4cc1435-e723-49a2-8894-bac86f2cceca" (UID: "c4cc1435-e723-49a2-8894-bac86f2cceca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.721866 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4cc1435-e723-49a2-8894-bac86f2cceca-kube-api-access-z2shv" (OuterVolumeSpecName: "kube-api-access-z2shv") pod "c4cc1435-e723-49a2-8894-bac86f2cceca" (UID: "c4cc1435-e723-49a2-8894-bac86f2cceca"). InnerVolumeSpecName "kube-api-access-z2shv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.744863 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4cc1435-e723-49a2-8894-bac86f2cceca" (UID: "c4cc1435-e723-49a2-8894-bac86f2cceca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.812423 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.812472 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4cc1435-e723-49a2-8894-bac86f2cceca-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:10 crc kubenswrapper[5047]: I0223 08:48:10.812486 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2shv\" (UniqueName: \"kubernetes.io/projected/c4cc1435-e723-49a2-8894-bac86f2cceca-kube-api-access-z2shv\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:11 crc kubenswrapper[5047]: I0223 08:48:11.550669 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcxqb" Feb 23 08:48:11 crc kubenswrapper[5047]: I0223 08:48:11.618259 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcxqb"] Feb 23 08:48:11 crc kubenswrapper[5047]: I0223 08:48:11.629630 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcxqb"] Feb 23 08:48:12 crc kubenswrapper[5047]: I0223 08:48:12.363700 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" path="/var/lib/kubelet/pods/c4cc1435-e723-49a2-8894-bac86f2cceca/volumes" Feb 23 08:48:12 crc kubenswrapper[5047]: E0223 08:48:12.672931 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3443e301_4942_4ae3_b3e9_5ab7dbee8973.slice/crio-ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5\": RecentStats: unable to find data in memory cache]" Feb 23 08:48:15 crc kubenswrapper[5047]: I0223 08:48:15.843478 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 23 08:48:22 crc kubenswrapper[5047]: E0223 08:48:22.937409 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3443e301_4942_4ae3_b3e9_5ab7dbee8973.slice/crio-ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5\": RecentStats: unable to find data in memory cache]" Feb 23 08:48:33 crc kubenswrapper[5047]: E0223 08:48:33.216605 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3443e301_4942_4ae3_b3e9_5ab7dbee8973.slice/crio-ede2f63e8f90b1942079d343e69299c847811f3548dca2e09d9ef440be5d78a5\": RecentStats: unable to find data in memory cache]" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.800355 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:41 crc kubenswrapper[5047]: E0223 08:48:41.801517 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="registry-server" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.801540 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="registry-server" Feb 23 08:48:41 crc kubenswrapper[5047]: E0223 08:48:41.801575 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="extract-content" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.801585 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="extract-content" Feb 23 08:48:41 crc kubenswrapper[5047]: E0223 08:48:41.801603 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="extract-utilities" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.801614 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="extract-utilities" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.801882 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cc1435-e723-49a2-8894-bac86f2cceca" containerName="registry-server" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.803362 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.805980 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.815309 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.901176 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d79lj\" (UniqueName: \"kubernetes.io/projected/f007d224-7f17-4a4b-8e93-6c1403eb910b-kube-api-access-d79lj\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.901561 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f007d224-7f17-4a4b-8e93-6c1403eb910b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.901593 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.901714 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.901740 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:41 crc kubenswrapper[5047]: I0223 08:48:41.901863 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.003411 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.003460 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.003567 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.003611 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d79lj\" 
(UniqueName: \"kubernetes.io/projected/f007d224-7f17-4a4b-8e93-6c1403eb910b-kube-api-access-d79lj\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.003642 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f007d224-7f17-4a4b-8e93-6c1403eb910b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.003673 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.003859 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f007d224-7f17-4a4b-8e93-6c1403eb910b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.014049 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.014609 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " 
pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.024032 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.024620 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.028218 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d79lj\" (UniqueName: \"kubernetes.io/projected/f007d224-7f17-4a4b-8e93-6c1403eb910b-kube-api-access-d79lj\") pod \"cinder-scheduler-0\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.127345 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.647219 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.664134 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:48:42 crc kubenswrapper[5047]: I0223 08:48:42.914453 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f007d224-7f17-4a4b-8e93-6c1403eb910b","Type":"ContainerStarted","Data":"77ef60525501a3288270def0a8a30e59dca16a37d02ae850de80d8c855f2f555"} Feb 23 08:48:43 crc kubenswrapper[5047]: I0223 08:48:43.459233 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:43 crc kubenswrapper[5047]: I0223 08:48:43.459617 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api-log" containerID="cri-o://ba80a118cc1cae29394933b6f8cc8932fe14ae9ed0fef6b07af3ed37c8dfb24d" gracePeriod=30 Feb 23 08:48:43 crc kubenswrapper[5047]: I0223 08:48:43.460194 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api" containerID="cri-o://5522886e1eb4d6218403b0092117849746ae36ff10eccfabe8e57edb9436a651" gracePeriod=30 Feb 23 08:48:43 crc kubenswrapper[5047]: I0223 08:48:43.931840 5047 generic.go:334] "Generic (PLEG): container finished" podID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerID="ba80a118cc1cae29394933b6f8cc8932fe14ae9ed0fef6b07af3ed37c8dfb24d" exitCode=143 Feb 23 08:48:43 crc kubenswrapper[5047]: I0223 08:48:43.932051 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e453f6f5-cef4-4ff2-a391-192467480ea6","Type":"ContainerDied","Data":"ba80a118cc1cae29394933b6f8cc8932fe14ae9ed0fef6b07af3ed37c8dfb24d"} Feb 23 08:48:43 crc kubenswrapper[5047]: I0223 08:48:43.937277 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f007d224-7f17-4a4b-8e93-6c1403eb910b","Type":"ContainerStarted","Data":"c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02"} Feb 23 08:48:44 crc kubenswrapper[5047]: I0223 08:48:44.952940 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f007d224-7f17-4a4b-8e93-6c1403eb910b","Type":"ContainerStarted","Data":"aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0"} Feb 23 08:48:46 crc kubenswrapper[5047]: I0223 08:48:46.624369 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.67:8776/healthcheck\": read tcp 10.217.0.2:38384->10.217.1.67:8776: read: connection reset by peer" Feb 23 08:48:46 crc kubenswrapper[5047]: I0223 08:48:46.798671 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:48:46 crc kubenswrapper[5047]: I0223 08:48:46.798735 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:48:46 crc kubenswrapper[5047]: I0223 08:48:46.972096 5047 generic.go:334] "Generic (PLEG): container 
finished" podID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerID="5522886e1eb4d6218403b0092117849746ae36ff10eccfabe8e57edb9436a651" exitCode=0 Feb 23 08:48:46 crc kubenswrapper[5047]: I0223 08:48:46.972144 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e453f6f5-cef4-4ff2-a391-192467480ea6","Type":"ContainerDied","Data":"5522886e1eb4d6218403b0092117849746ae36ff10eccfabe8e57edb9436a651"} Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.104508 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.128417 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.808579607 podStartE2EDuration="6.128395162s" podCreationTimestamp="2026-02-23 08:48:41 +0000 UTC" firstStartedPulling="2026-02-23 08:48:42.663876297 +0000 UTC m=+7444.915203431" lastFinishedPulling="2026-02-23 08:48:42.983691852 +0000 UTC m=+7445.235018986" observedRunningTime="2026-02-23 08:48:44.977811903 +0000 UTC m=+7447.229139077" watchObservedRunningTime="2026-02-23 08:48:47.128395162 +0000 UTC m=+7449.379722296" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.133400 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.216114 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.216612 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-scripts\") pod 
\"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.216666 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data-custom\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.216719 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e453f6f5-cef4-4ff2-a391-192467480ea6-etc-machine-id\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.216862 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-public-tls-certs\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.216920 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e453f6f5-cef4-4ff2-a391-192467480ea6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.216957 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4hxw\" (UniqueName: \"kubernetes.io/projected/e453f6f5-cef4-4ff2-a391-192467480ea6-kube-api-access-p4hxw\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.217081 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-internal-tls-certs\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.217123 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e453f6f5-cef4-4ff2-a391-192467480ea6-logs\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.217349 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-combined-ca-bundle\") pod \"e453f6f5-cef4-4ff2-a391-192467480ea6\" (UID: \"e453f6f5-cef4-4ff2-a391-192467480ea6\") " Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.217840 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e453f6f5-cef4-4ff2-a391-192467480ea6-logs" (OuterVolumeSpecName: "logs") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.219187 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e453f6f5-cef4-4ff2-a391-192467480ea6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.219378 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e453f6f5-cef4-4ff2-a391-192467480ea6-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.224442 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.234701 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e453f6f5-cef4-4ff2-a391-192467480ea6-kube-api-access-p4hxw" (OuterVolumeSpecName: "kube-api-access-p4hxw") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "kube-api-access-p4hxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.235947 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-scripts" (OuterVolumeSpecName: "scripts") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.262371 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.274207 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.286974 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.295373 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data" (OuterVolumeSpecName: "config-data") pod "e453f6f5-cef4-4ff2-a391-192467480ea6" (UID: "e453f6f5-cef4-4ff2-a391-192467480ea6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.320671 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.320709 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.320721 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.320735 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.320745 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.320755 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e453f6f5-cef4-4ff2-a391-192467480ea6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.320765 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4hxw\" (UniqueName: \"kubernetes.io/projected/e453f6f5-cef4-4ff2-a391-192467480ea6-kube-api-access-p4hxw\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.995072 
5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e453f6f5-cef4-4ff2-a391-192467480ea6","Type":"ContainerDied","Data":"7858e848acbd62e0eeb6932d4ade4fa7135412f5595ba5320ff3fdf6b86f2f2e"} Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.995224 5047 scope.go:117] "RemoveContainer" containerID="5522886e1eb4d6218403b0092117849746ae36ff10eccfabe8e57edb9436a651" Feb 23 08:48:47 crc kubenswrapper[5047]: I0223 08:48:47.995678 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.043561 5047 scope.go:117] "RemoveContainer" containerID="ba80a118cc1cae29394933b6f8cc8932fe14ae9ed0fef6b07af3ed37c8dfb24d" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.074238 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.115414 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.134130 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:48 crc kubenswrapper[5047]: E0223 08:48:48.134732 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.134757 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api" Feb 23 08:48:48 crc kubenswrapper[5047]: E0223 08:48:48.134775 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api-log" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.134784 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api-log" Feb 23 
08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.135040 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.135064 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" containerName="cinder-api-log" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.136366 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.139598 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.139895 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.140660 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.147729 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.238561 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.238680 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc 
kubenswrapper[5047]: I0223 08:48:48.238711 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg9zb\" (UniqueName: \"kubernetes.io/projected/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-kube-api-access-fg9zb\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.238757 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.238845 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.238897 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.239014 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-scripts\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.239046 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-logs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.239084 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.341536 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-scripts\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342014 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-logs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342049 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342084 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 
23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342117 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342144 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg9zb\" (UniqueName: \"kubernetes.io/projected/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-kube-api-access-fg9zb\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342188 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342247 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342289 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342610 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-logs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.342691 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.352814 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.354491 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.356285 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-scripts\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.356509 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.361139 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.371263 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.380111 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e453f6f5-cef4-4ff2-a391-192467480ea6" path="/var/lib/kubelet/pods/e453f6f5-cef4-4ff2-a391-192467480ea6/volumes" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.380320 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg9zb\" (UniqueName: \"kubernetes.io/projected/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-kube-api-access-fg9zb\") pod \"cinder-api-0\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") " pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.465571 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:48:48 crc kubenswrapper[5047]: I0223 08:48:48.995265 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:48:49 crc kubenswrapper[5047]: I0223 08:48:49.007835 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d570b43-d0ff-42d7-a305-9d7c2f9f9881","Type":"ContainerStarted","Data":"521f1cfb6ff27fc630ac83fac3af6c49d9aff0ef6a40c0df0f03027347f320db"} Feb 23 08:48:50 crc kubenswrapper[5047]: I0223 08:48:50.020792 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d570b43-d0ff-42d7-a305-9d7c2f9f9881","Type":"ContainerStarted","Data":"3cd1af755bfcf4970c210c91d87839f5bae5b4edc4ad2683851e88ccef337c33"} Feb 23 08:48:51 crc kubenswrapper[5047]: I0223 08:48:51.034820 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d570b43-d0ff-42d7-a305-9d7c2f9f9881","Type":"ContainerStarted","Data":"6a1e38661d1c5ddbb45bad7c88e6e2bbc8848d6927477296120c0dddf2445603"} Feb 23 08:48:51 crc kubenswrapper[5047]: I0223 08:48:51.036240 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 08:48:51 crc kubenswrapper[5047]: I0223 08:48:51.062930 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.062867158 podStartE2EDuration="3.062867158s" podCreationTimestamp="2026-02-23 08:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:48:51.057487314 +0000 UTC m=+7453.308814488" watchObservedRunningTime="2026-02-23 08:48:51.062867158 +0000 UTC m=+7453.314194302" Feb 23 08:48:52 crc kubenswrapper[5047]: I0223 08:48:52.349350 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 
08:48:52 crc kubenswrapper[5047]: I0223 08:48:52.420476 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:53 crc kubenswrapper[5047]: I0223 08:48:53.059827 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="cinder-scheduler" containerID="cri-o://c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02" gracePeriod=30 Feb 23 08:48:53 crc kubenswrapper[5047]: I0223 08:48:53.060620 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="probe" containerID="cri-o://aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0" gracePeriod=30 Feb 23 08:48:54 crc kubenswrapper[5047]: I0223 08:48:54.076553 5047 generic.go:334] "Generic (PLEG): container finished" podID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerID="aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0" exitCode=0 Feb 23 08:48:54 crc kubenswrapper[5047]: I0223 08:48:54.076654 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f007d224-7f17-4a4b-8e93-6c1403eb910b","Type":"ContainerDied","Data":"aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0"} Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.050594 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.101855 5047 generic.go:334] "Generic (PLEG): container finished" podID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerID="c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02" exitCode=0 Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.101968 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f007d224-7f17-4a4b-8e93-6c1403eb910b","Type":"ContainerDied","Data":"c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02"} Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.102016 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f007d224-7f17-4a4b-8e93-6c1403eb910b","Type":"ContainerDied","Data":"77ef60525501a3288270def0a8a30e59dca16a37d02ae850de80d8c855f2f555"} Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.102044 5047 scope.go:117] "RemoveContainer" containerID="aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.102295 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.121071 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data\") pod \"f007d224-7f17-4a4b-8e93-6c1403eb910b\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.121113 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data-custom\") pod \"f007d224-7f17-4a4b-8e93-6c1403eb910b\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.121226 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-combined-ca-bundle\") pod \"f007d224-7f17-4a4b-8e93-6c1403eb910b\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.121259 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d79lj\" (UniqueName: \"kubernetes.io/projected/f007d224-7f17-4a4b-8e93-6c1403eb910b-kube-api-access-d79lj\") pod \"f007d224-7f17-4a4b-8e93-6c1403eb910b\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.121364 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f007d224-7f17-4a4b-8e93-6c1403eb910b-etc-machine-id\") pod \"f007d224-7f17-4a4b-8e93-6c1403eb910b\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.121405 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-scripts\") pod \"f007d224-7f17-4a4b-8e93-6c1403eb910b\" (UID: \"f007d224-7f17-4a4b-8e93-6c1403eb910b\") " Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.125201 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f007d224-7f17-4a4b-8e93-6c1403eb910b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f007d224-7f17-4a4b-8e93-6c1403eb910b" (UID: "f007d224-7f17-4a4b-8e93-6c1403eb910b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.129311 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-scripts" (OuterVolumeSpecName: "scripts") pod "f007d224-7f17-4a4b-8e93-6c1403eb910b" (UID: "f007d224-7f17-4a4b-8e93-6c1403eb910b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.130463 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f007d224-7f17-4a4b-8e93-6c1403eb910b-kube-api-access-d79lj" (OuterVolumeSpecName: "kube-api-access-d79lj") pod "f007d224-7f17-4a4b-8e93-6c1403eb910b" (UID: "f007d224-7f17-4a4b-8e93-6c1403eb910b"). InnerVolumeSpecName "kube-api-access-d79lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.132800 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f007d224-7f17-4a4b-8e93-6c1403eb910b" (UID: "f007d224-7f17-4a4b-8e93-6c1403eb910b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.148687 5047 scope.go:117] "RemoveContainer" containerID="c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.190860 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f007d224-7f17-4a4b-8e93-6c1403eb910b" (UID: "f007d224-7f17-4a4b-8e93-6c1403eb910b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.227156 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f007d224-7f17-4a4b-8e93-6c1403eb910b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.238082 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.238307 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.238451 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.238808 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d79lj\" (UniqueName: \"kubernetes.io/projected/f007d224-7f17-4a4b-8e93-6c1403eb910b-kube-api-access-d79lj\") on 
node \"crc\" DevicePath \"\"" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.255360 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data" (OuterVolumeSpecName: "config-data") pod "f007d224-7f17-4a4b-8e93-6c1403eb910b" (UID: "f007d224-7f17-4a4b-8e93-6c1403eb910b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.328751 5047 scope.go:117] "RemoveContainer" containerID="aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0" Feb 23 08:48:55 crc kubenswrapper[5047]: E0223 08:48:55.330325 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0\": container with ID starting with aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0 not found: ID does not exist" containerID="aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.330388 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0"} err="failed to get container status \"aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0\": rpc error: code = NotFound desc = could not find container \"aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0\": container with ID starting with aebdae66adfbe4e70788a8d5c664bcb9a469c9cbe650842c4b1dbc3c9d6264a0 not found: ID does not exist" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.330426 5047 scope.go:117] "RemoveContainer" containerID="c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02" Feb 23 08:48:55 crc kubenswrapper[5047]: E0223 08:48:55.330944 5047 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02\": container with ID starting with c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02 not found: ID does not exist" containerID="c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.331007 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02"} err="failed to get container status \"c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02\": rpc error: code = NotFound desc = could not find container \"c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02\": container with ID starting with c3aa0b10a4a8e5fcf3ab503cd4d20dcf5444c979b32535915f150f5c76cd8b02 not found: ID does not exist" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.341756 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f007d224-7f17-4a4b-8e93-6c1403eb910b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.444060 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.448834 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.484487 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:55 crc kubenswrapper[5047]: E0223 08:48:55.485023 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="probe" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.485045 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="probe" Feb 23 08:48:55 crc kubenswrapper[5047]: E0223 08:48:55.485080 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="cinder-scheduler" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.485087 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="cinder-scheduler" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.485287 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="probe" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.485324 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" containerName="cinder-scheduler" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.486529 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.490522 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.509552 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.646932 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.647011 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.647243 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.647312 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.647347 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j289z\" (UniqueName: \"kubernetes.io/projected/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-kube-api-access-j289z\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.647600 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.749026 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.749096 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.749183 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.749218 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.749256 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j289z\" (UniqueName: \"kubernetes.io/projected/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-kube-api-access-j289z\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.749375 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc 
kubenswrapper[5047]: I0223 08:48:55.749565 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.754004 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.754720 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.754771 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.755184 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.766358 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j289z\" (UniqueName: 
\"kubernetes.io/projected/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-kube-api-access-j289z\") pod \"cinder-scheduler-0\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " pod="openstack/cinder-scheduler-0" Feb 23 08:48:55 crc kubenswrapper[5047]: I0223 08:48:55.804450 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:48:56 crc kubenswrapper[5047]: I0223 08:48:56.318816 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:48:56 crc kubenswrapper[5047]: I0223 08:48:56.356722 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f007d224-7f17-4a4b-8e93-6c1403eb910b" path="/var/lib/kubelet/pods/f007d224-7f17-4a4b-8e93-6c1403eb910b/volumes" Feb 23 08:48:56 crc kubenswrapper[5047]: I0223 08:48:56.792977 5047 scope.go:117] "RemoveContainer" containerID="b1646bdaab221b27450b2e82c3c3b57b4d98452c88093359f43031a9f8604659" Feb 23 08:48:57 crc kubenswrapper[5047]: I0223 08:48:57.133198 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c3821a-cbde-41d5-95fd-1e617b2d12bc","Type":"ContainerStarted","Data":"458b0e1642c869d058c9b43e3416f43f71ec7ed0417844e4fdd5515463a4d080"} Feb 23 08:48:57 crc kubenswrapper[5047]: I0223 08:48:57.133569 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c3821a-cbde-41d5-95fd-1e617b2d12bc","Type":"ContainerStarted","Data":"2c83c28ee8b401e217876193e0b182868a9c236011490fd79285c3519c9dd77e"} Feb 23 08:48:58 crc kubenswrapper[5047]: I0223 08:48:58.143287 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c3821a-cbde-41d5-95fd-1e617b2d12bc","Type":"ContainerStarted","Data":"0ff894197ec53c6fa19bcceb0a1998cc1302ed5266ed87a7fab99b18cfd8df6e"} Feb 23 08:48:58 crc kubenswrapper[5047]: I0223 08:48:58.170251 5047 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.170229466 podStartE2EDuration="3.170229466s" podCreationTimestamp="2026-02-23 08:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:48:58.166072524 +0000 UTC m=+7460.417399668" watchObservedRunningTime="2026-02-23 08:48:58.170229466 +0000 UTC m=+7460.421556600" Feb 23 08:49:00 crc kubenswrapper[5047]: I0223 08:49:00.539857 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 23 08:49:00 crc kubenswrapper[5047]: I0223 08:49:00.805390 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 08:49:06 crc kubenswrapper[5047]: I0223 08:49:06.231372 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.394356 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-zfpsk"] Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.395880 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.409391 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zfpsk"] Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.457479 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a225bdc-5631-4a69-a67d-b59b7c055392-operator-scripts\") pod \"glance-db-create-zfpsk\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.457591 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x2mr\" (UniqueName: \"kubernetes.io/projected/9a225bdc-5631-4a69-a67d-b59b7c055392-kube-api-access-4x2mr\") pod \"glance-db-create-zfpsk\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.497298 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0953-account-create-update-w46gb"] Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.498637 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.500979 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.522499 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0953-account-create-update-w46gb"] Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.559156 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a225bdc-5631-4a69-a67d-b59b7c055392-operator-scripts\") pod \"glance-db-create-zfpsk\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.559221 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x2mr\" (UniqueName: \"kubernetes.io/projected/9a225bdc-5631-4a69-a67d-b59b7c055392-kube-api-access-4x2mr\") pod \"glance-db-create-zfpsk\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.559331 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd73798-3881-4a98-86b0-dc80cf37d901-operator-scripts\") pod \"glance-0953-account-create-update-w46gb\" (UID: \"3cd73798-3881-4a98-86b0-dc80cf37d901\") " pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.559354 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb5pf\" (UniqueName: \"kubernetes.io/projected/3cd73798-3881-4a98-86b0-dc80cf37d901-kube-api-access-kb5pf\") pod \"glance-0953-account-create-update-w46gb\" (UID: 
\"3cd73798-3881-4a98-86b0-dc80cf37d901\") " pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.560094 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a225bdc-5631-4a69-a67d-b59b7c055392-operator-scripts\") pod \"glance-db-create-zfpsk\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.588585 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x2mr\" (UniqueName: \"kubernetes.io/projected/9a225bdc-5631-4a69-a67d-b59b7c055392-kube-api-access-4x2mr\") pod \"glance-db-create-zfpsk\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.662118 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd73798-3881-4a98-86b0-dc80cf37d901-operator-scripts\") pod \"glance-0953-account-create-update-w46gb\" (UID: \"3cd73798-3881-4a98-86b0-dc80cf37d901\") " pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.662416 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb5pf\" (UniqueName: \"kubernetes.io/projected/3cd73798-3881-4a98-86b0-dc80cf37d901-kube-api-access-kb5pf\") pod \"glance-0953-account-create-update-w46gb\" (UID: \"3cd73798-3881-4a98-86b0-dc80cf37d901\") " pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.663463 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd73798-3881-4a98-86b0-dc80cf37d901-operator-scripts\") pod \"glance-0953-account-create-update-w46gb\" 
(UID: \"3cd73798-3881-4a98-86b0-dc80cf37d901\") " pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.699943 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb5pf\" (UniqueName: \"kubernetes.io/projected/3cd73798-3881-4a98-86b0-dc80cf37d901-kube-api-access-kb5pf\") pod \"glance-0953-account-create-update-w46gb\" (UID: \"3cd73798-3881-4a98-86b0-dc80cf37d901\") " pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.716347 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:09 crc kubenswrapper[5047]: I0223 08:49:09.835817 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:10 crc kubenswrapper[5047]: I0223 08:49:10.176887 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zfpsk"] Feb 23 08:49:10 crc kubenswrapper[5047]: W0223 08:49:10.180250 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a225bdc_5631_4a69_a67d_b59b7c055392.slice/crio-cc5840b4c014c4fc6985236652fd1d0493ba93ab1b4d7164439be22aeee5df37 WatchSource:0}: Error finding container cc5840b4c014c4fc6985236652fd1d0493ba93ab1b4d7164439be22aeee5df37: Status 404 returned error can't find the container with id cc5840b4c014c4fc6985236652fd1d0493ba93ab1b4d7164439be22aeee5df37 Feb 23 08:49:10 crc kubenswrapper[5047]: I0223 08:49:10.285258 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zfpsk" event={"ID":"9a225bdc-5631-4a69-a67d-b59b7c055392","Type":"ContainerStarted","Data":"cc5840b4c014c4fc6985236652fd1d0493ba93ab1b4d7164439be22aeee5df37"} Feb 23 08:49:10 crc kubenswrapper[5047]: I0223 08:49:10.312484 5047 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-0953-account-create-update-w46gb"] Feb 23 08:49:11 crc kubenswrapper[5047]: I0223 08:49:11.298288 5047 generic.go:334] "Generic (PLEG): container finished" podID="9a225bdc-5631-4a69-a67d-b59b7c055392" containerID="bafd5f1c994050944eaf630f40ff770762aa20659a81d657072958d5ad173f93" exitCode=0 Feb 23 08:49:11 crc kubenswrapper[5047]: I0223 08:49:11.298426 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zfpsk" event={"ID":"9a225bdc-5631-4a69-a67d-b59b7c055392","Type":"ContainerDied","Data":"bafd5f1c994050944eaf630f40ff770762aa20659a81d657072958d5ad173f93"} Feb 23 08:49:11 crc kubenswrapper[5047]: I0223 08:49:11.301217 5047 generic.go:334] "Generic (PLEG): container finished" podID="3cd73798-3881-4a98-86b0-dc80cf37d901" containerID="a164cf73073bc9b7a575ba24df33d8f55c9938205bd8eacdd659460c4cd11ee7" exitCode=0 Feb 23 08:49:11 crc kubenswrapper[5047]: I0223 08:49:11.301282 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0953-account-create-update-w46gb" event={"ID":"3cd73798-3881-4a98-86b0-dc80cf37d901","Type":"ContainerDied","Data":"a164cf73073bc9b7a575ba24df33d8f55c9938205bd8eacdd659460c4cd11ee7"} Feb 23 08:49:11 crc kubenswrapper[5047]: I0223 08:49:11.301319 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0953-account-create-update-w46gb" event={"ID":"3cd73798-3881-4a98-86b0-dc80cf37d901","Type":"ContainerStarted","Data":"36ff2a0ffb18bb2e53abcefa372b74e1c86d4a39088e4bce7b5002be777c9636"} Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.784180 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.797124 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.840480 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb5pf\" (UniqueName: \"kubernetes.io/projected/3cd73798-3881-4a98-86b0-dc80cf37d901-kube-api-access-kb5pf\") pod \"3cd73798-3881-4a98-86b0-dc80cf37d901\" (UID: \"3cd73798-3881-4a98-86b0-dc80cf37d901\") " Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.840566 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd73798-3881-4a98-86b0-dc80cf37d901-operator-scripts\") pod \"3cd73798-3881-4a98-86b0-dc80cf37d901\" (UID: \"3cd73798-3881-4a98-86b0-dc80cf37d901\") " Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.842310 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd73798-3881-4a98-86b0-dc80cf37d901-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cd73798-3881-4a98-86b0-dc80cf37d901" (UID: "3cd73798-3881-4a98-86b0-dc80cf37d901"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.850290 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd73798-3881-4a98-86b0-dc80cf37d901-kube-api-access-kb5pf" (OuterVolumeSpecName: "kube-api-access-kb5pf") pod "3cd73798-3881-4a98-86b0-dc80cf37d901" (UID: "3cd73798-3881-4a98-86b0-dc80cf37d901"). InnerVolumeSpecName "kube-api-access-kb5pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.944224 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a225bdc-5631-4a69-a67d-b59b7c055392-operator-scripts\") pod \"9a225bdc-5631-4a69-a67d-b59b7c055392\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.945053 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x2mr\" (UniqueName: \"kubernetes.io/projected/9a225bdc-5631-4a69-a67d-b59b7c055392-kube-api-access-4x2mr\") pod \"9a225bdc-5631-4a69-a67d-b59b7c055392\" (UID: \"9a225bdc-5631-4a69-a67d-b59b7c055392\") " Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.945740 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a225bdc-5631-4a69-a67d-b59b7c055392-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a225bdc-5631-4a69-a67d-b59b7c055392" (UID: "9a225bdc-5631-4a69-a67d-b59b7c055392"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.946695 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb5pf\" (UniqueName: \"kubernetes.io/projected/3cd73798-3881-4a98-86b0-dc80cf37d901-kube-api-access-kb5pf\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.946723 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd73798-3881-4a98-86b0-dc80cf37d901-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.946767 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a225bdc-5631-4a69-a67d-b59b7c055392-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:12 crc kubenswrapper[5047]: I0223 08:49:12.950122 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a225bdc-5631-4a69-a67d-b59b7c055392-kube-api-access-4x2mr" (OuterVolumeSpecName: "kube-api-access-4x2mr") pod "9a225bdc-5631-4a69-a67d-b59b7c055392" (UID: "9a225bdc-5631-4a69-a67d-b59b7c055392"). InnerVolumeSpecName "kube-api-access-4x2mr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:13 crc kubenswrapper[5047]: I0223 08:49:13.049065 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x2mr\" (UniqueName: \"kubernetes.io/projected/9a225bdc-5631-4a69-a67d-b59b7c055392-kube-api-access-4x2mr\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:13 crc kubenswrapper[5047]: I0223 08:49:13.322923 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zfpsk" event={"ID":"9a225bdc-5631-4a69-a67d-b59b7c055392","Type":"ContainerDied","Data":"cc5840b4c014c4fc6985236652fd1d0493ba93ab1b4d7164439be22aeee5df37"} Feb 23 08:49:13 crc kubenswrapper[5047]: I0223 08:49:13.322952 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zfpsk" Feb 23 08:49:13 crc kubenswrapper[5047]: I0223 08:49:13.322978 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc5840b4c014c4fc6985236652fd1d0493ba93ab1b4d7164439be22aeee5df37" Feb 23 08:49:13 crc kubenswrapper[5047]: I0223 08:49:13.324569 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0953-account-create-update-w46gb" event={"ID":"3cd73798-3881-4a98-86b0-dc80cf37d901","Type":"ContainerDied","Data":"36ff2a0ffb18bb2e53abcefa372b74e1c86d4a39088e4bce7b5002be777c9636"} Feb 23 08:49:13 crc kubenswrapper[5047]: I0223 08:49:13.324608 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ff2a0ffb18bb2e53abcefa372b74e1c86d4a39088e4bce7b5002be777c9636" Feb 23 08:49:13 crc kubenswrapper[5047]: I0223 08:49:13.324651 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0953-account-create-update-w46gb" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.748588 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-h4lq4"] Feb 23 08:49:14 crc kubenswrapper[5047]: E0223 08:49:14.749669 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd73798-3881-4a98-86b0-dc80cf37d901" containerName="mariadb-account-create-update" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.749691 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd73798-3881-4a98-86b0-dc80cf37d901" containerName="mariadb-account-create-update" Feb 23 08:49:14 crc kubenswrapper[5047]: E0223 08:49:14.749741 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a225bdc-5631-4a69-a67d-b59b7c055392" containerName="mariadb-database-create" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.749748 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a225bdc-5631-4a69-a67d-b59b7c055392" containerName="mariadb-database-create" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.749983 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a225bdc-5631-4a69-a67d-b59b7c055392" containerName="mariadb-database-create" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.750007 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd73798-3881-4a98-86b0-dc80cf37d901" containerName="mariadb-account-create-update" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.750874 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.753671 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.756216 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-52cgw" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.760383 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-h4lq4"] Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.825518 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-db-sync-config-data\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.828530 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-config-data\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.828608 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-combined-ca-bundle\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.828877 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lq2l\" (UniqueName: 
\"kubernetes.io/projected/6eb42e62-854d-4100-a22c-631a4292d890-kube-api-access-5lq2l\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.931353 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-config-data\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.931412 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-combined-ca-bundle\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.931506 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lq2l\" (UniqueName: \"kubernetes.io/projected/6eb42e62-854d-4100-a22c-631a4292d890-kube-api-access-5lq2l\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.931627 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-db-sync-config-data\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.938335 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-db-sync-config-data\") pod \"glance-db-sync-h4lq4\" 
(UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.938433 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-config-data\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.939199 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-combined-ca-bundle\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:14 crc kubenswrapper[5047]: I0223 08:49:14.951440 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lq2l\" (UniqueName: \"kubernetes.io/projected/6eb42e62-854d-4100-a22c-631a4292d890-kube-api-access-5lq2l\") pod \"glance-db-sync-h4lq4\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:15 crc kubenswrapper[5047]: I0223 08:49:15.077272 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:15 crc kubenswrapper[5047]: I0223 08:49:15.472397 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-h4lq4"] Feb 23 08:49:15 crc kubenswrapper[5047]: W0223 08:49:15.477641 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eb42e62_854d_4100_a22c_631a4292d890.slice/crio-b5a4c31e01062ca7fe3e55822945ecdc0cdf554963dd081d77354379df685168 WatchSource:0}: Error finding container b5a4c31e01062ca7fe3e55822945ecdc0cdf554963dd081d77354379df685168: Status 404 returned error can't find the container with id b5a4c31e01062ca7fe3e55822945ecdc0cdf554963dd081d77354379df685168 Feb 23 08:49:16 crc kubenswrapper[5047]: I0223 08:49:16.369599 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h4lq4" event={"ID":"6eb42e62-854d-4100-a22c-631a4292d890","Type":"ContainerStarted","Data":"b5a4c31e01062ca7fe3e55822945ecdc0cdf554963dd081d77354379df685168"} Feb 23 08:49:16 crc kubenswrapper[5047]: I0223 08:49:16.760565 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:49:16 crc kubenswrapper[5047]: I0223 08:49:16.760658 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:49:33 crc kubenswrapper[5047]: I0223 08:49:33.712442 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h4lq4" 
event={"ID":"6eb42e62-854d-4100-a22c-631a4292d890","Type":"ContainerStarted","Data":"797cd075f412154a1f8d9ad573c8dace687d3d7909df29163a20b4d2fcc3c2ba"} Feb 23 08:49:33 crc kubenswrapper[5047]: I0223 08:49:33.740121 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-h4lq4" podStartSLOduration=2.729097974 podStartE2EDuration="19.740097446s" podCreationTimestamp="2026-02-23 08:49:14 +0000 UTC" firstStartedPulling="2026-02-23 08:49:15.483579991 +0000 UTC m=+7477.734907125" lastFinishedPulling="2026-02-23 08:49:32.494579453 +0000 UTC m=+7494.745906597" observedRunningTime="2026-02-23 08:49:33.737227059 +0000 UTC m=+7495.988554233" watchObservedRunningTime="2026-02-23 08:49:33.740097446 +0000 UTC m=+7495.991424620" Feb 23 08:49:37 crc kubenswrapper[5047]: I0223 08:49:37.758074 5047 generic.go:334] "Generic (PLEG): container finished" podID="6eb42e62-854d-4100-a22c-631a4292d890" containerID="797cd075f412154a1f8d9ad573c8dace687d3d7909df29163a20b4d2fcc3c2ba" exitCode=0 Feb 23 08:49:37 crc kubenswrapper[5047]: I0223 08:49:37.758181 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h4lq4" event={"ID":"6eb42e62-854d-4100-a22c-631a4292d890","Type":"ContainerDied","Data":"797cd075f412154a1f8d9ad573c8dace687d3d7909df29163a20b4d2fcc3c2ba"} Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.223157 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.310019 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lq2l\" (UniqueName: \"kubernetes.io/projected/6eb42e62-854d-4100-a22c-631a4292d890-kube-api-access-5lq2l\") pod \"6eb42e62-854d-4100-a22c-631a4292d890\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.310255 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-db-sync-config-data\") pod \"6eb42e62-854d-4100-a22c-631a4292d890\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.310322 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-combined-ca-bundle\") pod \"6eb42e62-854d-4100-a22c-631a4292d890\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.310352 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-config-data\") pod \"6eb42e62-854d-4100-a22c-631a4292d890\" (UID: \"6eb42e62-854d-4100-a22c-631a4292d890\") " Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.318943 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb42e62-854d-4100-a22c-631a4292d890-kube-api-access-5lq2l" (OuterVolumeSpecName: "kube-api-access-5lq2l") pod "6eb42e62-854d-4100-a22c-631a4292d890" (UID: "6eb42e62-854d-4100-a22c-631a4292d890"). InnerVolumeSpecName "kube-api-access-5lq2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.319964 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6eb42e62-854d-4100-a22c-631a4292d890" (UID: "6eb42e62-854d-4100-a22c-631a4292d890"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.350488 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eb42e62-854d-4100-a22c-631a4292d890" (UID: "6eb42e62-854d-4100-a22c-631a4292d890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.405493 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-config-data" (OuterVolumeSpecName: "config-data") pod "6eb42e62-854d-4100-a22c-631a4292d890" (UID: "6eb42e62-854d-4100-a22c-631a4292d890"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.414185 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lq2l\" (UniqueName: \"kubernetes.io/projected/6eb42e62-854d-4100-a22c-631a4292d890-kube-api-access-5lq2l\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.414225 5047 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.414322 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.414342 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb42e62-854d-4100-a22c-631a4292d890-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.777509 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-h4lq4" event={"ID":"6eb42e62-854d-4100-a22c-631a4292d890","Type":"ContainerDied","Data":"b5a4c31e01062ca7fe3e55822945ecdc0cdf554963dd081d77354379df685168"} Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.777562 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5a4c31e01062ca7fe3e55822945ecdc0cdf554963dd081d77354379df685168" Feb 23 08:49:39 crc kubenswrapper[5047]: I0223 08:49:39.777601 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-h4lq4" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.151660 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:40 crc kubenswrapper[5047]: E0223 08:49:40.152634 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb42e62-854d-4100-a22c-631a4292d890" containerName="glance-db-sync" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.152657 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb42e62-854d-4100-a22c-631a4292d890" containerName="glance-db-sync" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.152889 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb42e62-854d-4100-a22c-631a4292d890" containerName="glance-db-sync" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.154277 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.156782 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-52cgw" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.157008 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.158551 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.179249 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.229829 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.229927 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.229989 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.230087 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x4m6\" (UniqueName: \"kubernetes.io/projected/5c96c5d8-0169-468a-beda-f39bb76b150c-kube-api-access-5x4m6\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.230120 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-logs\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.230150 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.281299 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fbc9bf8d5-tkl62"] Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.282781 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.292068 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc9bf8d5-tkl62"] Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332474 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x4m6\" (UniqueName: \"kubernetes.io/projected/5c96c5d8-0169-468a-beda-f39bb76b150c-kube-api-access-5x4m6\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332557 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-logs\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332591 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332670 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-config\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332720 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332758 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332788 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332816 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332840 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332874 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8tzn\" (UniqueName: \"kubernetes.io/projected/866615de-c907-42a8-835e-918da29902ff-kube-api-access-p8tzn\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.332924 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.334086 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-logs\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.336046 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.339173 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.345268 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.356156 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x4m6\" (UniqueName: \"kubernetes.io/projected/5c96c5d8-0169-468a-beda-f39bb76b150c-kube-api-access-5x4m6\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.357145 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.431684 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.434281 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-config\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.434438 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.434560 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.434657 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8tzn\" (UniqueName: \"kubernetes.io/projected/866615de-c907-42a8-835e-918da29902ff-kube-api-access-p8tzn\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.434744 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.435761 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-config\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.437045 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.439484 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.439532 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.441074 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.448479 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.466051 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8tzn\" (UniqueName: \"kubernetes.io/projected/866615de-c907-42a8-835e-918da29902ff-kube-api-access-p8tzn\") pod \"dnsmasq-dns-5fbc9bf8d5-tkl62\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.470599 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.473998 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.536599 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.537145 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.537171 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qslv6\" 
(UniqueName: \"kubernetes.io/projected/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-kube-api-access-qslv6\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.537228 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.537267 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.537297 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.607753 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.638622 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.638691 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.638717 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qslv6\" (UniqueName: \"kubernetes.io/projected/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-kube-api-access-qslv6\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.638764 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.638795 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " 
pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.638833 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.639496 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.640625 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.646764 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.646843 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.647805 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.659314 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qslv6\" (UniqueName: \"kubernetes.io/projected/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-kube-api-access-qslv6\") pod \"glance-default-internal-api-0\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:40 crc kubenswrapper[5047]: I0223 08:49:40.810242 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.218302 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc9bf8d5-tkl62"] Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.226985 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.691985 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.820470 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.826893 5047 generic.go:334] "Generic (PLEG): container finished" podID="866615de-c907-42a8-835e-918da29902ff" containerID="9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05" exitCode=0 Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.826982 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" 
event={"ID":"866615de-c907-42a8-835e-918da29902ff","Type":"ContainerDied","Data":"9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05"} Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.827011 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" event={"ID":"866615de-c907-42a8-835e-918da29902ff","Type":"ContainerStarted","Data":"c99bb28fedc518a884067b76d3d60b93b5c9ba7cba1b8597a401220613efea5e"} Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.835577 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c96c5d8-0169-468a-beda-f39bb76b150c","Type":"ContainerStarted","Data":"b4c300806af7f9c4fac9da53802cb0820de3d74e883378395dbfd47d54c7139d"} Feb 23 08:49:41 crc kubenswrapper[5047]: I0223 08:49:41.854274 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8","Type":"ContainerStarted","Data":"edd353545e12a481804bbf67096ac273221a13bb6722522b70086cdc2fd3eae6"} Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.880975 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" event={"ID":"866615de-c907-42a8-835e-918da29902ff","Type":"ContainerStarted","Data":"65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2"} Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.881633 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.884403 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c96c5d8-0169-468a-beda-f39bb76b150c","Type":"ContainerStarted","Data":"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e"} Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.884430 5047 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c96c5d8-0169-468a-beda-f39bb76b150c","Type":"ContainerStarted","Data":"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f"} Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.884566 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-log" containerID="cri-o://9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f" gracePeriod=30 Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.884853 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-httpd" containerID="cri-o://eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e" gracePeriod=30 Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.889564 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8","Type":"ContainerStarted","Data":"a6186d215ad7a57d35d2f665f282fc8f3c5f56b3ee5f6bf2f1bdca7762d547e9"} Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.927860 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" podStartSLOduration=2.9278309 podStartE2EDuration="2.9278309s" podCreationTimestamp="2026-02-23 08:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:42.911557893 +0000 UTC m=+7505.162885027" watchObservedRunningTime="2026-02-23 08:49:42.9278309 +0000 UTC m=+7505.179158024" Feb 23 08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.952993 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 
08:49:42 crc kubenswrapper[5047]: I0223 08:49:42.958984 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.958961595 podStartE2EDuration="2.958961595s" podCreationTimestamp="2026-02-23 08:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:42.941695122 +0000 UTC m=+7505.193022256" watchObservedRunningTime="2026-02-23 08:49:42.958961595 +0000 UTC m=+7505.210288719" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.452508 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.614524 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-scripts\") pod \"5c96c5d8-0169-468a-beda-f39bb76b150c\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.614624 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-config-data\") pod \"5c96c5d8-0169-468a-beda-f39bb76b150c\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.614988 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-logs\") pod \"5c96c5d8-0169-468a-beda-f39bb76b150c\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.615358 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-logs" 
(OuterVolumeSpecName: "logs") pod "5c96c5d8-0169-468a-beda-f39bb76b150c" (UID: "5c96c5d8-0169-468a-beda-f39bb76b150c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.615439 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-httpd-run\") pod \"5c96c5d8-0169-468a-beda-f39bb76b150c\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.615628 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5c96c5d8-0169-468a-beda-f39bb76b150c" (UID: "5c96c5d8-0169-468a-beda-f39bb76b150c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.615695 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-combined-ca-bundle\") pod \"5c96c5d8-0169-468a-beda-f39bb76b150c\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.616031 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x4m6\" (UniqueName: \"kubernetes.io/projected/5c96c5d8-0169-468a-beda-f39bb76b150c-kube-api-access-5x4m6\") pod \"5c96c5d8-0169-468a-beda-f39bb76b150c\" (UID: \"5c96c5d8-0169-468a-beda-f39bb76b150c\") " Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.616450 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 
08:49:43.616467 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c96c5d8-0169-468a-beda-f39bb76b150c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.622028 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c96c5d8-0169-468a-beda-f39bb76b150c-kube-api-access-5x4m6" (OuterVolumeSpecName: "kube-api-access-5x4m6") pod "5c96c5d8-0169-468a-beda-f39bb76b150c" (UID: "5c96c5d8-0169-468a-beda-f39bb76b150c"). InnerVolumeSpecName "kube-api-access-5x4m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.624564 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-scripts" (OuterVolumeSpecName: "scripts") pod "5c96c5d8-0169-468a-beda-f39bb76b150c" (UID: "5c96c5d8-0169-468a-beda-f39bb76b150c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.650046 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c96c5d8-0169-468a-beda-f39bb76b150c" (UID: "5c96c5d8-0169-468a-beda-f39bb76b150c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.676985 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-config-data" (OuterVolumeSpecName: "config-data") pod "5c96c5d8-0169-468a-beda-f39bb76b150c" (UID: "5c96c5d8-0169-468a-beda-f39bb76b150c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.717711 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.717747 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x4m6\" (UniqueName: \"kubernetes.io/projected/5c96c5d8-0169-468a-beda-f39bb76b150c-kube-api-access-5x4m6\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.717760 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.717769 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c96c5d8-0169-468a-beda-f39bb76b150c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.900179 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8","Type":"ContainerStarted","Data":"640785e4e41685161f7d1ad957f6b772163bce0d1962206bf9d40200db078131"} Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.900344 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-log" containerID="cri-o://a6186d215ad7a57d35d2f665f282fc8f3c5f56b3ee5f6bf2f1bdca7762d547e9" gracePeriod=30 Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.900392 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-httpd" containerID="cri-o://640785e4e41685161f7d1ad957f6b772163bce0d1962206bf9d40200db078131" gracePeriod=30 Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.903175 5047 generic.go:334] "Generic (PLEG): container finished" podID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerID="eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e" exitCode=143 Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.903202 5047 generic.go:334] "Generic (PLEG): container finished" podID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerID="9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f" exitCode=143 Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.903217 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c96c5d8-0169-468a-beda-f39bb76b150c","Type":"ContainerDied","Data":"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e"} Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.903247 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.903267 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c96c5d8-0169-468a-beda-f39bb76b150c","Type":"ContainerDied","Data":"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f"} Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.903281 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c96c5d8-0169-468a-beda-f39bb76b150c","Type":"ContainerDied","Data":"b4c300806af7f9c4fac9da53802cb0820de3d74e883378395dbfd47d54c7139d"} Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.903308 5047 scope.go:117] "RemoveContainer" containerID="eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.944592 5047 scope.go:117] "RemoveContainer" containerID="9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.950987 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.950968434 podStartE2EDuration="3.950968434s" podCreationTimestamp="2026-02-23 08:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:43.942741213 +0000 UTC m=+7506.194068347" watchObservedRunningTime="2026-02-23 08:49:43.950968434 +0000 UTC m=+7506.202295568" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.986092 5047 scope.go:117] "RemoveContainer" containerID="eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e" Feb 23 08:49:43 crc kubenswrapper[5047]: E0223 08:49:43.986646 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e\": container with ID starting with eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e not found: ID does not exist" containerID="eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.986685 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e"} err="failed to get container status \"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e\": rpc error: code = NotFound desc = could not find container \"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e\": container with ID starting with eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e not found: ID does not exist" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.986724 5047 scope.go:117] "RemoveContainer" containerID="9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f" Feb 23 08:49:43 crc kubenswrapper[5047]: E0223 08:49:43.986949 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f\": container with ID starting with 9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f not found: ID does not exist" containerID="9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.986972 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f"} err="failed to get container status \"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f\": rpc error: code = NotFound desc = could not find container \"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f\": container with ID 
starting with 9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f not found: ID does not exist" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.987005 5047 scope.go:117] "RemoveContainer" containerID="eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.988569 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e"} err="failed to get container status \"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e\": rpc error: code = NotFound desc = could not find container \"eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e\": container with ID starting with eda7bbdc02daa5b36d9a29b5d180d02d7ac642ead7f4f5ad8467a78a0c1a863e not found: ID does not exist" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.988590 5047 scope.go:117] "RemoveContainer" containerID="9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.990124 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f"} err="failed to get container status \"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f\": rpc error: code = NotFound desc = could not find container \"9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f\": container with ID starting with 9ac0721eb9d342731b36af4d86f353a7f060151903c516b40c7617a7a4ca544f not found: ID does not exist" Feb 23 08:49:43 crc kubenswrapper[5047]: I0223 08:49:43.990650 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.004099 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:44 
crc kubenswrapper[5047]: I0223 08:49:44.025623 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:44 crc kubenswrapper[5047]: E0223 08:49:44.026081 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-log" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.026103 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-log" Feb 23 08:49:44 crc kubenswrapper[5047]: E0223 08:49:44.026121 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-httpd" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.026129 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-httpd" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.026382 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-log" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.026407 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" containerName="glance-httpd" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.027439 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.030130 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.030453 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.037708 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.128783 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.128927 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.129295 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.129410 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-config-data\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.129453 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-logs\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.129478 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-scripts\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.129581 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ck8\" (UniqueName: \"kubernetes.io/projected/25721b59-54fc-4bb4-99d3-baeec00b6794-kube-api-access-z4ck8\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.231727 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-config-data\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.231788 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-logs\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.231817 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-scripts\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.231918 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ck8\" (UniqueName: \"kubernetes.io/projected/25721b59-54fc-4bb4-99d3-baeec00b6794-kube-api-access-z4ck8\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.232019 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.232057 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.232132 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.232506 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-logs\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.232685 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.237683 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-scripts\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.238289 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.238978 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.249415 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-config-data\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.252857 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ck8\" (UniqueName: \"kubernetes.io/projected/25721b59-54fc-4bb4-99d3-baeec00b6794-kube-api-access-z4ck8\") pod \"glance-default-external-api-0\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.351390 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c96c5d8-0169-468a-beda-f39bb76b150c" path="/var/lib/kubelet/pods/5c96c5d8-0169-468a-beda-f39bb76b150c/volumes" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.362743 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.940014 5047 generic.go:334] "Generic (PLEG): container finished" podID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerID="640785e4e41685161f7d1ad957f6b772163bce0d1962206bf9d40200db078131" exitCode=0 Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.940471 5047 generic.go:334] "Generic (PLEG): container finished" podID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerID="a6186d215ad7a57d35d2f665f282fc8f3c5f56b3ee5f6bf2f1bdca7762d547e9" exitCode=143 Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.940544 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8","Type":"ContainerDied","Data":"640785e4e41685161f7d1ad957f6b772163bce0d1962206bf9d40200db078131"} Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.940582 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8","Type":"ContainerDied","Data":"a6186d215ad7a57d35d2f665f282fc8f3c5f56b3ee5f6bf2f1bdca7762d547e9"} Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.940596 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8","Type":"ContainerDied","Data":"edd353545e12a481804bbf67096ac273221a13bb6722522b70086cdc2fd3eae6"} Feb 23 08:49:44 crc kubenswrapper[5047]: I0223 08:49:44.940632 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edd353545e12a481804bbf67096ac273221a13bb6722522b70086cdc2fd3eae6" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.008673 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.008989 5047 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:49:45 crc kubenswrapper[5047]: W0223 08:49:45.011505 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25721b59_54fc_4bb4_99d3_baeec00b6794.slice/crio-1d44fe834b1c58cbe166cac3125e8397d4937d44adfde62a5a54135f7b570bde WatchSource:0}: Error finding container 1d44fe834b1c58cbe166cac3125e8397d4937d44adfde62a5a54135f7b570bde: Status 404 returned error can't find the container with id 1d44fe834b1c58cbe166cac3125e8397d4937d44adfde62a5a54135f7b570bde Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.161271 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-config-data\") pod \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.161666 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-scripts\") pod \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.161719 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-httpd-run\") pod \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.161780 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-combined-ca-bundle\") pod \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\" (UID: 
\"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.161799 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-logs\") pod \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.161856 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qslv6\" (UniqueName: \"kubernetes.io/projected/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-kube-api-access-qslv6\") pod \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\" (UID: \"b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8\") " Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.165322 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" (UID: "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.165561 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-logs" (OuterVolumeSpecName: "logs") pod "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" (UID: "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.173087 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-scripts" (OuterVolumeSpecName: "scripts") pod "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" (UID: "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.173141 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-kube-api-access-qslv6" (OuterVolumeSpecName: "kube-api-access-qslv6") pod "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" (UID: "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8"). InnerVolumeSpecName "kube-api-access-qslv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.209840 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" (UID: "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.240222 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-config-data" (OuterVolumeSpecName: "config-data") pod "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" (UID: "b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.263696 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.263729 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.263740 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.263750 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qslv6\" (UniqueName: \"kubernetes.io/projected/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-kube-api-access-qslv6\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.263758 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.263766 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.968716 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.969150 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25721b59-54fc-4bb4-99d3-baeec00b6794","Type":"ContainerStarted","Data":"d3d9093d8deff26964fc7d5561bb294d0155048f79dbc17801862ad39d212e4e"} Feb 23 08:49:45 crc kubenswrapper[5047]: I0223 08:49:45.969393 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25721b59-54fc-4bb4-99d3-baeec00b6794","Type":"ContainerStarted","Data":"1d44fe834b1c58cbe166cac3125e8397d4937d44adfde62a5a54135f7b570bde"} Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.033516 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.046242 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.066044 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:49:46 crc kubenswrapper[5047]: E0223 08:49:46.066697 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-httpd" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.066742 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-httpd" Feb 23 08:49:46 crc kubenswrapper[5047]: E0223 08:49:46.066754 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-log" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.066763 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-log" Feb 23 08:49:46 crc 
kubenswrapper[5047]: I0223 08:49:46.067602 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-httpd" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.067755 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" containerName="glance-log" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.071207 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.076243 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.076402 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.093379 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.096817 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.096876 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.096910 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.096957 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.097033 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfnb\" (UniqueName: \"kubernetes.io/projected/9b7fa1af-25c8-42b0-a972-4666fa4e077f-kube-api-access-cxfnb\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.097072 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.097091 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.199191 
5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.199279 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.199322 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.199374 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.199483 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfnb\" (UniqueName: \"kubernetes.io/projected/9b7fa1af-25c8-42b0-a972-4666fa4e077f-kube-api-access-cxfnb\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.199539 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.199568 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.200310 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.200435 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.204845 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.207322 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.218732 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.219455 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.222358 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfnb\" (UniqueName: \"kubernetes.io/projected/9b7fa1af-25c8-42b0-a972-4666fa4e077f-kube-api-access-cxfnb\") pod \"glance-default-internal-api-0\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.353262 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8" path="/var/lib/kubelet/pods/b1fb0ac3-a6aa-4d6e-972c-2b94ed0463a8/volumes" Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.409729 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.761297 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.761759 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.761812 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv"
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.762560 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.762608 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" gracePeriod=600
Feb 23 08:49:46 crc kubenswrapper[5047]: E0223 08:49:46.943191 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.966941 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:49:46 crc kubenswrapper[5047]: W0223 08:49:46.970334 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7fa1af_25c8_42b0_a972_4666fa4e077f.slice/crio-c5468c8e21bd17c3568a2bcad01e6bedd812a73e027653ac80f8232fb6af1d22 WatchSource:0}: Error finding container c5468c8e21bd17c3568a2bcad01e6bedd812a73e027653ac80f8232fb6af1d22: Status 404 returned error can't find the container with id c5468c8e21bd17c3568a2bcad01e6bedd812a73e027653ac80f8232fb6af1d22
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.981680 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" exitCode=0
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.981745 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"}
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.981782 5047 scope.go:117] "RemoveContainer" containerID="5480ffc93828193b25025f4f8362ac8f87100f59cbe44b98393e3159b56048e0"
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.982454 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"
Feb 23 08:49:46 crc kubenswrapper[5047]: E0223 08:49:46.982713 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:49:46 crc kubenswrapper[5047]: I0223 08:49:46.983975 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25721b59-54fc-4bb4-99d3-baeec00b6794","Type":"ContainerStarted","Data":"2e1c400557585d9a0e53099a74875e3e37dfd67d5c33a46d6a666c6b8800a90d"}
Feb 23 08:49:48 crc kubenswrapper[5047]: I0223 08:49:48.012618 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b7fa1af-25c8-42b0-a972-4666fa4e077f","Type":"ContainerStarted","Data":"cbabf61df6138f8686107e1586bb8754fc246799ee80efcedfe1302d8c33e2fb"}
Feb 23 08:49:48 crc kubenswrapper[5047]: I0223 08:49:48.012995 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b7fa1af-25c8-42b0-a972-4666fa4e077f","Type":"ContainerStarted","Data":"c5468c8e21bd17c3568a2bcad01e6bedd812a73e027653ac80f8232fb6af1d22"}
Feb 23 08:49:48 crc kubenswrapper[5047]: I0223 08:49:48.379361 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.379336398 podStartE2EDuration="5.379336398s" podCreationTimestamp="2026-02-23 08:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:47.044374303 +0000 UTC m=+7509.295701437" watchObservedRunningTime="2026-02-23 08:49:48.379336398 +0000 UTC m=+7510.630663532"
Feb 23 08:49:49 crc kubenswrapper[5047]: I0223 08:49:49.029229 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b7fa1af-25c8-42b0-a972-4666fa4e077f","Type":"ContainerStarted","Data":"a9c8d1972e5c3f1cc86887c0e2504a4bc819cd81afddbe8505b9ce83220d4f84"}
Feb 23 08:49:49 crc kubenswrapper[5047]: I0223 08:49:49.059280 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.05925281 podStartE2EDuration="3.05925281s" podCreationTimestamp="2026-02-23 08:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:49.058467698 +0000 UTC m=+7511.309794852" watchObservedRunningTime="2026-02-23 08:49:49.05925281 +0000 UTC m=+7511.310579944"
Feb 23 08:49:50 crc kubenswrapper[5047]: I0223 08:49:50.609051 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62"
Feb 23 08:49:50 crc kubenswrapper[5047]: I0223 08:49:50.671599 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-788bb74c87-zcjrv"]
Feb 23 08:49:50 crc kubenswrapper[5047]: I0223 08:49:50.671936 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" podUID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerName="dnsmasq-dns" containerID="cri-o://48e0650f732385d422f2d60e1c549dbcc8d57ef7e39e1ded07af1b09e5f8788b" gracePeriod=10
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.052355 5047 generic.go:334] "Generic (PLEG): container finished" podID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerID="48e0650f732385d422f2d60e1c549dbcc8d57ef7e39e1ded07af1b09e5f8788b" exitCode=0
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.052403 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" event={"ID":"14629101-8773-4daf-8eab-2eb6d4b555dc","Type":"ContainerDied","Data":"48e0650f732385d422f2d60e1c549dbcc8d57ef7e39e1ded07af1b09e5f8788b"}
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.257588 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv"
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.421276 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl9lt\" (UniqueName: \"kubernetes.io/projected/14629101-8773-4daf-8eab-2eb6d4b555dc-kube-api-access-nl9lt\") pod \"14629101-8773-4daf-8eab-2eb6d4b555dc\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") "
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.421392 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-nb\") pod \"14629101-8773-4daf-8eab-2eb6d4b555dc\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") "
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.421492 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-config\") pod \"14629101-8773-4daf-8eab-2eb6d4b555dc\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") "
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.421538 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-sb\") pod \"14629101-8773-4daf-8eab-2eb6d4b555dc\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") "
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.421559 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-dns-svc\") pod \"14629101-8773-4daf-8eab-2eb6d4b555dc\" (UID: \"14629101-8773-4daf-8eab-2eb6d4b555dc\") "
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.428129 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14629101-8773-4daf-8eab-2eb6d4b555dc-kube-api-access-nl9lt" (OuterVolumeSpecName: "kube-api-access-nl9lt") pod "14629101-8773-4daf-8eab-2eb6d4b555dc" (UID: "14629101-8773-4daf-8eab-2eb6d4b555dc"). InnerVolumeSpecName "kube-api-access-nl9lt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.472594 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14629101-8773-4daf-8eab-2eb6d4b555dc" (UID: "14629101-8773-4daf-8eab-2eb6d4b555dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.476860 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "14629101-8773-4daf-8eab-2eb6d4b555dc" (UID: "14629101-8773-4daf-8eab-2eb6d4b555dc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.477619 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-config" (OuterVolumeSpecName: "config") pod "14629101-8773-4daf-8eab-2eb6d4b555dc" (UID: "14629101-8773-4daf-8eab-2eb6d4b555dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.480340 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "14629101-8773-4daf-8eab-2eb6d4b555dc" (UID: "14629101-8773-4daf-8eab-2eb6d4b555dc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.524280 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl9lt\" (UniqueName: \"kubernetes.io/projected/14629101-8773-4daf-8eab-2eb6d4b555dc-kube-api-access-nl9lt\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.524312 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.524324 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.524333 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:51 crc kubenswrapper[5047]: I0223 08:49:51.524342 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14629101-8773-4daf-8eab-2eb6d4b555dc-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:52 crc kubenswrapper[5047]: I0223 08:49:52.067024 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv" event={"ID":"14629101-8773-4daf-8eab-2eb6d4b555dc","Type":"ContainerDied","Data":"03563cbc08d9817fbfbd47e291680361789ca8c37e99946c9fb3442d9d88cb10"}
Feb 23 08:49:52 crc kubenswrapper[5047]: I0223 08:49:52.067079 5047 scope.go:117] "RemoveContainer" containerID="48e0650f732385d422f2d60e1c549dbcc8d57ef7e39e1ded07af1b09e5f8788b"
Feb 23 08:49:52 crc kubenswrapper[5047]: I0223 08:49:52.067148 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788bb74c87-zcjrv"
Feb 23 08:49:52 crc kubenswrapper[5047]: I0223 08:49:52.099992 5047 scope.go:117] "RemoveContainer" containerID="2bbdd0c4f119bfaaba3e1360ad16a71d8ab9a176ad0f3c50c51c378426156ac5"
Feb 23 08:49:52 crc kubenswrapper[5047]: I0223 08:49:52.111263 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-788bb74c87-zcjrv"]
Feb 23 08:49:52 crc kubenswrapper[5047]: I0223 08:49:52.127253 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-788bb74c87-zcjrv"]
Feb 23 08:49:52 crc kubenswrapper[5047]: I0223 08:49:52.356274 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14629101-8773-4daf-8eab-2eb6d4b555dc" path="/var/lib/kubelet/pods/14629101-8773-4daf-8eab-2eb6d4b555dc/volumes"
Feb 23 08:49:54 crc kubenswrapper[5047]: I0223 08:49:54.364016 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:54 crc kubenswrapper[5047]: I0223 08:49:54.364526 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:54 crc kubenswrapper[5047]: I0223 08:49:54.407791 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:54 crc kubenswrapper[5047]: I0223 08:49:54.444422 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:55 crc kubenswrapper[5047]: I0223 08:49:55.109196 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:55 crc kubenswrapper[5047]: I0223 08:49:55.109704 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:56 crc kubenswrapper[5047]: I0223 08:49:56.410647 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:56 crc kubenswrapper[5047]: I0223 08:49:56.410750 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:56 crc kubenswrapper[5047]: I0223 08:49:56.451382 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:56 crc kubenswrapper[5047]: I0223 08:49:56.473472 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:57 crc kubenswrapper[5047]: I0223 08:49:57.131441 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:57 crc kubenswrapper[5047]: I0223 08:49:57.131975 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:57 crc kubenswrapper[5047]: I0223 08:49:57.157625 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:57 crc kubenswrapper[5047]: I0223 08:49:57.158004 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 08:49:57 crc kubenswrapper[5047]: I0223 08:49:57.234303 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 08:49:59 crc kubenswrapper[5047]: I0223 08:49:59.222354 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 08:49:59 crc kubenswrapper[5047]: I0223 08:49:59.223081 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 08:49:59 crc kubenswrapper[5047]: I0223 08:49:59.341823 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"
Feb 23 08:49:59 crc kubenswrapper[5047]: E0223 08:49:59.342188 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:49:59 crc kubenswrapper[5047]: I0223 08:49:59.359722 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.757866 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lh9zc"]
Feb 23 08:50:05 crc kubenswrapper[5047]: E0223 08:50:05.758885 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerName="dnsmasq-dns"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.758901 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerName="dnsmasq-dns"
Feb 23 08:50:05 crc kubenswrapper[5047]: E0223 08:50:05.758928 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerName="init"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.758936 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerName="init"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.759157 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="14629101-8773-4daf-8eab-2eb6d4b555dc" containerName="dnsmasq-dns"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.759867 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.768165 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lh9zc"]
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.850577 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-697vv\" (UniqueName: \"kubernetes.io/projected/0d2461d7-38c9-4c73-b4d1-4bfb14579729-kube-api-access-697vv\") pod \"placement-db-create-lh9zc\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") " pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.851155 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2461d7-38c9-4c73-b4d1-4bfb14579729-operator-scripts\") pod \"placement-db-create-lh9zc\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") " pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.873264 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5aa7-account-create-update-7sxdw"]
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.874532 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.877378 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.884766 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5aa7-account-create-update-7sxdw"]
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.952536 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2461d7-38c9-4c73-b4d1-4bfb14579729-operator-scripts\") pod \"placement-db-create-lh9zc\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") " pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.952617 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-operator-scripts\") pod \"placement-5aa7-account-create-update-7sxdw\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") " pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.952772 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4wz\" (UniqueName: \"kubernetes.io/projected/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-kube-api-access-kc4wz\") pod \"placement-5aa7-account-create-update-7sxdw\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") " pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.952811 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-697vv\" (UniqueName: \"kubernetes.io/projected/0d2461d7-38c9-4c73-b4d1-4bfb14579729-kube-api-access-697vv\") pod \"placement-db-create-lh9zc\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") " pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.953738 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2461d7-38c9-4c73-b4d1-4bfb14579729-operator-scripts\") pod \"placement-db-create-lh9zc\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") " pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:05 crc kubenswrapper[5047]: I0223 08:50:05.976784 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-697vv\" (UniqueName: \"kubernetes.io/projected/0d2461d7-38c9-4c73-b4d1-4bfb14579729-kube-api-access-697vv\") pod \"placement-db-create-lh9zc\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") " pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.054877 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-operator-scripts\") pod \"placement-5aa7-account-create-update-7sxdw\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") " pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.055073 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4wz\" (UniqueName: \"kubernetes.io/projected/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-kube-api-access-kc4wz\") pod \"placement-5aa7-account-create-update-7sxdw\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") " pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.055795 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-operator-scripts\") pod \"placement-5aa7-account-create-update-7sxdw\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") " pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.074325 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4wz\" (UniqueName: \"kubernetes.io/projected/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-kube-api-access-kc4wz\") pod \"placement-5aa7-account-create-update-7sxdw\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") " pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.113280 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.192077 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.703292 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lh9zc"]
Feb 23 08:50:06 crc kubenswrapper[5047]: I0223 08:50:06.851705 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5aa7-account-create-update-7sxdw"]
Feb 23 08:50:07 crc kubenswrapper[5047]: I0223 08:50:07.269942 5047 generic.go:334] "Generic (PLEG): container finished" podID="56cc0be6-76ee-4ceb-941c-7de1f44f8e5a" containerID="82664f001a677293fec10488e9ce7ea1ab3bc3bd67b4a1161cfbf65efe2a9a95" exitCode=0
Feb 23 08:50:07 crc kubenswrapper[5047]: I0223 08:50:07.270038 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5aa7-account-create-update-7sxdw" event={"ID":"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a","Type":"ContainerDied","Data":"82664f001a677293fec10488e9ce7ea1ab3bc3bd67b4a1161cfbf65efe2a9a95"}
Feb 23 08:50:07 crc kubenswrapper[5047]: I0223 08:50:07.270420 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5aa7-account-create-update-7sxdw" event={"ID":"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a","Type":"ContainerStarted","Data":"9a517f1f3a51b27aad57be0042cdd467497156b1d0c0e5e76769d60a9345ea22"}
Feb 23 08:50:07 crc kubenswrapper[5047]: I0223 08:50:07.272083 5047 generic.go:334] "Generic (PLEG): container finished" podID="0d2461d7-38c9-4c73-b4d1-4bfb14579729" containerID="afd4a6068fba22f091867e206c4d7e4c0238f6a75c0b5879958e874b98751887" exitCode=0
Feb 23 08:50:07 crc kubenswrapper[5047]: I0223 08:50:07.272157 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lh9zc" event={"ID":"0d2461d7-38c9-4c73-b4d1-4bfb14579729","Type":"ContainerDied","Data":"afd4a6068fba22f091867e206c4d7e4c0238f6a75c0b5879958e874b98751887"}
Feb 23 08:50:07 crc kubenswrapper[5047]: I0223 08:50:07.272215 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lh9zc" event={"ID":"0d2461d7-38c9-4c73-b4d1-4bfb14579729","Type":"ContainerStarted","Data":"c8d2f483047ae85adee1e0cae2724f5c19c366058533087ca64d575a9f8cd1a7"}
Feb 23 08:50:08 crc kubenswrapper[5047]: I0223 08:50:08.944225 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:08 crc kubenswrapper[5047]: I0223 08:50:08.950034 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.041490 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2461d7-38c9-4c73-b4d1-4bfb14579729-operator-scripts\") pod \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") "
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.041616 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-operator-scripts\") pod \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") "
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.041681 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc4wz\" (UniqueName: \"kubernetes.io/projected/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-kube-api-access-kc4wz\") pod \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\" (UID: \"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a\") "
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.041786 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-697vv\" (UniqueName: \"kubernetes.io/projected/0d2461d7-38c9-4c73-b4d1-4bfb14579729-kube-api-access-697vv\") pod \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\" (UID: \"0d2461d7-38c9-4c73-b4d1-4bfb14579729\") "
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.042383 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56cc0be6-76ee-4ceb-941c-7de1f44f8e5a" (UID: "56cc0be6-76ee-4ceb-941c-7de1f44f8e5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.042378 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d2461d7-38c9-4c73-b4d1-4bfb14579729-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d2461d7-38c9-4c73-b4d1-4bfb14579729" (UID: "0d2461d7-38c9-4c73-b4d1-4bfb14579729"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.049508 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-kube-api-access-kc4wz" (OuterVolumeSpecName: "kube-api-access-kc4wz") pod "56cc0be6-76ee-4ceb-941c-7de1f44f8e5a" (UID: "56cc0be6-76ee-4ceb-941c-7de1f44f8e5a"). InnerVolumeSpecName "kube-api-access-kc4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.049794 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2461d7-38c9-4c73-b4d1-4bfb14579729-kube-api-access-697vv" (OuterVolumeSpecName: "kube-api-access-697vv") pod "0d2461d7-38c9-4c73-b4d1-4bfb14579729" (UID: "0d2461d7-38c9-4c73-b4d1-4bfb14579729"). InnerVolumeSpecName "kube-api-access-697vv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.144330 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d2461d7-38c9-4c73-b4d1-4bfb14579729-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.144613 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.144623 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc4wz\" (UniqueName: \"kubernetes.io/projected/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a-kube-api-access-kc4wz\") on node \"crc\" DevicePath \"\""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.144632 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-697vv\" (UniqueName: \"kubernetes.io/projected/0d2461d7-38c9-4c73-b4d1-4bfb14579729-kube-api-access-697vv\") on node \"crc\" DevicePath \"\""
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.293509 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5aa7-account-create-update-7sxdw"
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.293514 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5aa7-account-create-update-7sxdw" event={"ID":"56cc0be6-76ee-4ceb-941c-7de1f44f8e5a","Type":"ContainerDied","Data":"9a517f1f3a51b27aad57be0042cdd467497156b1d0c0e5e76769d60a9345ea22"}
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.293591 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a517f1f3a51b27aad57be0042cdd467497156b1d0c0e5e76769d60a9345ea22"
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.295656 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lh9zc" event={"ID":"0d2461d7-38c9-4c73-b4d1-4bfb14579729","Type":"ContainerDied","Data":"c8d2f483047ae85adee1e0cae2724f5c19c366058533087ca64d575a9f8cd1a7"}
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.295703 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d2f483047ae85adee1e0cae2724f5c19c366058533087ca64d575a9f8cd1a7"
Feb 23 08:50:09 crc kubenswrapper[5047]: I0223 08:50:09.295711 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lh9zc"
Feb 23 08:50:10 crc kubenswrapper[5047]: I0223 08:50:10.342560 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"
Feb 23 08:50:10 crc kubenswrapper[5047]: E0223 08:50:10.342865 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.166883 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bc7dfddf9-69wmk"]
Feb 23 08:50:11 crc kubenswrapper[5047]: E0223 08:50:11.167650 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2461d7-38c9-4c73-b4d1-4bfb14579729" containerName="mariadb-database-create"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.167669 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2461d7-38c9-4c73-b4d1-4bfb14579729" containerName="mariadb-database-create"
Feb 23 08:50:11 crc kubenswrapper[5047]: E0223 08:50:11.167682 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56cc0be6-76ee-4ceb-941c-7de1f44f8e5a" containerName="mariadb-account-create-update"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.167692 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="56cc0be6-76ee-4ceb-941c-7de1f44f8e5a" containerName="mariadb-account-create-update"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.167867 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="56cc0be6-76ee-4ceb-941c-7de1f44f8e5a" containerName="mariadb-account-create-update"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.167888 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2461d7-38c9-4c73-b4d1-4bfb14579729" containerName="mariadb-database-create"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.168830 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.178948 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zdtw6"]
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.180723 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zdtw6"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.189133 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.189385 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-657jp"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.189528 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.191119 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.191190 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-dns-svc\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk"
Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.191226 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-config\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.191387 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bjwt\" (UniqueName: \"kubernetes.io/projected/8d8399d9-f20e-4917-acd7-d1f94eb6269d-kube-api-access-8bjwt\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.191443 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.202914 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zdtw6"] Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.214289 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc7dfddf9-69wmk"] Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.292869 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-dns-svc\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.292961 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-config\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293051 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-combined-ca-bundle\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293085 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bjwt\" (UniqueName: \"kubernetes.io/projected/8d8399d9-f20e-4917-acd7-d1f94eb6269d-kube-api-access-8bjwt\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293107 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293136 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-scripts\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293190 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5d4b\" (UniqueName: \"kubernetes.io/projected/26ecf089-0275-4659-9180-dd68b3ee8b3a-kube-api-access-d5d4b\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293218 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-config-data\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293257 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ecf089-0275-4659-9180-dd68b3ee8b3a-logs\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.293283 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.294323 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-dns-svc\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.294338 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-config\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.294644 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.294671 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.317965 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bjwt\" (UniqueName: \"kubernetes.io/projected/8d8399d9-f20e-4917-acd7-d1f94eb6269d-kube-api-access-8bjwt\") pod \"dnsmasq-dns-7bc7dfddf9-69wmk\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.394981 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-scripts\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.395059 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5d4b\" (UniqueName: 
\"kubernetes.io/projected/26ecf089-0275-4659-9180-dd68b3ee8b3a-kube-api-access-d5d4b\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.395087 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-config-data\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.395113 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ecf089-0275-4659-9180-dd68b3ee8b3a-logs\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.395241 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-combined-ca-bundle\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.396885 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ecf089-0275-4659-9180-dd68b3ee8b3a-logs\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.398809 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-config-data\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " 
pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.398982 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-combined-ca-bundle\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.399207 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-scripts\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.415522 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5d4b\" (UniqueName: \"kubernetes.io/projected/26ecf089-0275-4659-9180-dd68b3ee8b3a-kube-api-access-d5d4b\") pod \"placement-db-sync-zdtw6\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.487357 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:11 crc kubenswrapper[5047]: I0223 08:50:11.500194 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:12 crc kubenswrapper[5047]: I0223 08:50:12.011181 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc7dfddf9-69wmk"] Feb 23 08:50:12 crc kubenswrapper[5047]: I0223 08:50:12.126242 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zdtw6"] Feb 23 08:50:12 crc kubenswrapper[5047]: W0223 08:50:12.131989 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26ecf089_0275_4659_9180_dd68b3ee8b3a.slice/crio-26aafee23764151480d10112a124a42ec8b97c37729653420599648889a8dcee WatchSource:0}: Error finding container 26aafee23764151480d10112a124a42ec8b97c37729653420599648889a8dcee: Status 404 returned error can't find the container with id 26aafee23764151480d10112a124a42ec8b97c37729653420599648889a8dcee Feb 23 08:50:12 crc kubenswrapper[5047]: I0223 08:50:12.339378 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zdtw6" event={"ID":"26ecf089-0275-4659-9180-dd68b3ee8b3a","Type":"ContainerStarted","Data":"26aafee23764151480d10112a124a42ec8b97c37729653420599648889a8dcee"} Feb 23 08:50:12 crc kubenswrapper[5047]: I0223 08:50:12.346240 5047 generic.go:334] "Generic (PLEG): container finished" podID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerID="31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768" exitCode=0 Feb 23 08:50:12 crc kubenswrapper[5047]: I0223 08:50:12.356686 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" event={"ID":"8d8399d9-f20e-4917-acd7-d1f94eb6269d","Type":"ContainerDied","Data":"31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768"} Feb 23 08:50:12 crc kubenswrapper[5047]: I0223 08:50:12.356751 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" 
event={"ID":"8d8399d9-f20e-4917-acd7-d1f94eb6269d","Type":"ContainerStarted","Data":"c6533418e3e55fb00d28556990303d44941393dc57fa9d4402b3c55626891a8e"} Feb 23 08:50:13 crc kubenswrapper[5047]: I0223 08:50:13.360989 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" event={"ID":"8d8399d9-f20e-4917-acd7-d1f94eb6269d","Type":"ContainerStarted","Data":"a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb"} Feb 23 08:50:13 crc kubenswrapper[5047]: I0223 08:50:13.361962 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:13 crc kubenswrapper[5047]: I0223 08:50:13.402036 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" podStartSLOduration=2.402013821 podStartE2EDuration="2.402013821s" podCreationTimestamp="2026-02-23 08:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:50:13.388102838 +0000 UTC m=+7535.639429972" watchObservedRunningTime="2026-02-23 08:50:13.402013821 +0000 UTC m=+7535.653340955" Feb 23 08:50:16 crc kubenswrapper[5047]: I0223 08:50:16.397099 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zdtw6" event={"ID":"26ecf089-0275-4659-9180-dd68b3ee8b3a","Type":"ContainerStarted","Data":"e8f5ead500e17b9d657ae1be75538f707abe9c64946bc1720280b8e0264adbc1"} Feb 23 08:50:16 crc kubenswrapper[5047]: I0223 08:50:16.446168 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zdtw6" podStartSLOduration=1.5462192 podStartE2EDuration="5.446142408s" podCreationTimestamp="2026-02-23 08:50:11 +0000 UTC" firstStartedPulling="2026-02-23 08:50:12.149277934 +0000 UTC m=+7534.400605078" lastFinishedPulling="2026-02-23 08:50:16.049201112 +0000 UTC m=+7538.300528286" 
observedRunningTime="2026-02-23 08:50:16.421745192 +0000 UTC m=+7538.673072376" watchObservedRunningTime="2026-02-23 08:50:16.446142408 +0000 UTC m=+7538.697469542" Feb 23 08:50:18 crc kubenswrapper[5047]: I0223 08:50:18.422352 5047 generic.go:334] "Generic (PLEG): container finished" podID="26ecf089-0275-4659-9180-dd68b3ee8b3a" containerID="e8f5ead500e17b9d657ae1be75538f707abe9c64946bc1720280b8e0264adbc1" exitCode=0 Feb 23 08:50:18 crc kubenswrapper[5047]: I0223 08:50:18.422501 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zdtw6" event={"ID":"26ecf089-0275-4659-9180-dd68b3ee8b3a","Type":"ContainerDied","Data":"e8f5ead500e17b9d657ae1be75538f707abe9c64946bc1720280b8e0264adbc1"} Feb 23 08:50:19 crc kubenswrapper[5047]: I0223 08:50:19.954641 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.083996 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ecf089-0275-4659-9180-dd68b3ee8b3a-logs\") pod \"26ecf089-0275-4659-9180-dd68b3ee8b3a\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.084211 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-config-data\") pod \"26ecf089-0275-4659-9180-dd68b3ee8b3a\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.084320 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-combined-ca-bundle\") pod \"26ecf089-0275-4659-9180-dd68b3ee8b3a\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 
08:50:20.084414 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5d4b\" (UniqueName: \"kubernetes.io/projected/26ecf089-0275-4659-9180-dd68b3ee8b3a-kube-api-access-d5d4b\") pod \"26ecf089-0275-4659-9180-dd68b3ee8b3a\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.084551 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-scripts\") pod \"26ecf089-0275-4659-9180-dd68b3ee8b3a\" (UID: \"26ecf089-0275-4659-9180-dd68b3ee8b3a\") " Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.084661 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ecf089-0275-4659-9180-dd68b3ee8b3a-logs" (OuterVolumeSpecName: "logs") pod "26ecf089-0275-4659-9180-dd68b3ee8b3a" (UID: "26ecf089-0275-4659-9180-dd68b3ee8b3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.085059 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ecf089-0275-4659-9180-dd68b3ee8b3a-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.095010 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-scripts" (OuterVolumeSpecName: "scripts") pod "26ecf089-0275-4659-9180-dd68b3ee8b3a" (UID: "26ecf089-0275-4659-9180-dd68b3ee8b3a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.095135 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ecf089-0275-4659-9180-dd68b3ee8b3a-kube-api-access-d5d4b" (OuterVolumeSpecName: "kube-api-access-d5d4b") pod "26ecf089-0275-4659-9180-dd68b3ee8b3a" (UID: "26ecf089-0275-4659-9180-dd68b3ee8b3a"). InnerVolumeSpecName "kube-api-access-d5d4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.121888 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26ecf089-0275-4659-9180-dd68b3ee8b3a" (UID: "26ecf089-0275-4659-9180-dd68b3ee8b3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.132751 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-config-data" (OuterVolumeSpecName: "config-data") pod "26ecf089-0275-4659-9180-dd68b3ee8b3a" (UID: "26ecf089-0275-4659-9180-dd68b3ee8b3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.187000 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.187166 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.187247 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ecf089-0275-4659-9180-dd68b3ee8b3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.187332 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5d4b\" (UniqueName: \"kubernetes.io/projected/26ecf089-0275-4659-9180-dd68b3ee8b3a-kube-api-access-d5d4b\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.445703 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zdtw6" event={"ID":"26ecf089-0275-4659-9180-dd68b3ee8b3a","Type":"ContainerDied","Data":"26aafee23764151480d10112a124a42ec8b97c37729653420599648889a8dcee"} Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.446291 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26aafee23764151480d10112a124a42ec8b97c37729653420599648889a8dcee" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.446053 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zdtw6" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.554686 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6ff9694944-nmq4c"] Feb 23 08:50:20 crc kubenswrapper[5047]: E0223 08:50:20.555514 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ecf089-0275-4659-9180-dd68b3ee8b3a" containerName="placement-db-sync" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.555538 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ecf089-0275-4659-9180-dd68b3ee8b3a" containerName="placement-db-sync" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.555792 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ecf089-0275-4659-9180-dd68b3ee8b3a" containerName="placement-db-sync" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.557637 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.564284 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.564821 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.565239 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.565522 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.566685 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-657jp" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.569401 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-6ff9694944-nmq4c"] Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.699297 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-public-tls-certs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.699423 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6gr4\" (UniqueName: \"kubernetes.io/projected/07094621-ca88-4942-a226-76667658a5bc-kube-api-access-m6gr4\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.699469 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-combined-ca-bundle\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.699651 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07094621-ca88-4942-a226-76667658a5bc-logs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.699851 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-internal-tls-certs\") pod \"placement-6ff9694944-nmq4c\" (UID: 
\"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.699987 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-config-data\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.700181 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-scripts\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.802882 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-public-tls-certs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.802987 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6gr4\" (UniqueName: \"kubernetes.io/projected/07094621-ca88-4942-a226-76667658a5bc-kube-api-access-m6gr4\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.803029 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-combined-ca-bundle\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") 
" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.803056 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07094621-ca88-4942-a226-76667658a5bc-logs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.803127 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-internal-tls-certs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.803175 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-config-data\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.803472 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-scripts\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.804254 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07094621-ca88-4942-a226-76667658a5bc-logs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.809135 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-internal-tls-certs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.810979 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-combined-ca-bundle\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.813815 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-scripts\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.822321 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-config-data\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.822763 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-public-tls-certs\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.828415 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6gr4\" (UniqueName: 
\"kubernetes.io/projected/07094621-ca88-4942-a226-76667658a5bc-kube-api-access-m6gr4\") pod \"placement-6ff9694944-nmq4c\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") " pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:20 crc kubenswrapper[5047]: I0223 08:50:20.920637 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:21 crc kubenswrapper[5047]: I0223 08:50:21.442099 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6ff9694944-nmq4c"] Feb 23 08:50:21 crc kubenswrapper[5047]: I0223 08:50:21.457248 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ff9694944-nmq4c" event={"ID":"07094621-ca88-4942-a226-76667658a5bc","Type":"ContainerStarted","Data":"40ddd88a6a1cc45ff5d983c62212c82d17cddd1b683063a0a534936a73694212"} Feb 23 08:50:21 crc kubenswrapper[5047]: I0223 08:50:21.490831 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:50:21 crc kubenswrapper[5047]: I0223 08:50:21.563797 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc9bf8d5-tkl62"] Feb 23 08:50:21 crc kubenswrapper[5047]: I0223 08:50:21.564112 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" podUID="866615de-c907-42a8-835e-918da29902ff" containerName="dnsmasq-dns" containerID="cri-o://65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2" gracePeriod=10 Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.234482 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.345942 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-nb\") pod \"866615de-c907-42a8-835e-918da29902ff\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.346537 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-config\") pod \"866615de-c907-42a8-835e-918da29902ff\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.346612 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-sb\") pod \"866615de-c907-42a8-835e-918da29902ff\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.346770 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8tzn\" (UniqueName: \"kubernetes.io/projected/866615de-c907-42a8-835e-918da29902ff-kube-api-access-p8tzn\") pod \"866615de-c907-42a8-835e-918da29902ff\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.346940 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc\") pod \"866615de-c907-42a8-835e-918da29902ff\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.370062 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/866615de-c907-42a8-835e-918da29902ff-kube-api-access-p8tzn" (OuterVolumeSpecName: "kube-api-access-p8tzn") pod "866615de-c907-42a8-835e-918da29902ff" (UID: "866615de-c907-42a8-835e-918da29902ff"). InnerVolumeSpecName "kube-api-access-p8tzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.408830 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-config" (OuterVolumeSpecName: "config") pod "866615de-c907-42a8-835e-918da29902ff" (UID: "866615de-c907-42a8-835e-918da29902ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.438411 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "866615de-c907-42a8-835e-918da29902ff" (UID: "866615de-c907-42a8-835e-918da29902ff"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.450972 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.451553 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.451685 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8tzn\" (UniqueName: \"kubernetes.io/projected/866615de-c907-42a8-835e-918da29902ff-kube-api-access-p8tzn\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:22 crc kubenswrapper[5047]: E0223 08:50:22.459602 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc podName:866615de-c907-42a8-835e-918da29902ff nodeName:}" failed. No retries permitted until 2026-02-23 08:50:22.95956651 +0000 UTC m=+7545.210893644 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc") pod "866615de-c907-42a8-835e-918da29902ff" (UID: "866615de-c907-42a8-835e-918da29902ff") : error deleting /var/lib/kubelet/pods/866615de-c907-42a8-835e-918da29902ff/volume-subpaths: remove /var/lib/kubelet/pods/866615de-c907-42a8-835e-918da29902ff/volume-subpaths: no such file or directory Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.459782 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "866615de-c907-42a8-835e-918da29902ff" (UID: "866615de-c907-42a8-835e-918da29902ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.485500 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ff9694944-nmq4c" event={"ID":"07094621-ca88-4942-a226-76667658a5bc","Type":"ContainerStarted","Data":"7ec9f905acecad1325d415f2ddb8f8369e83f15dfec01b649e5e7392716e0b34"} Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.485953 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ff9694944-nmq4c" event={"ID":"07094621-ca88-4942-a226-76667658a5bc","Type":"ContainerStarted","Data":"20ea81041bb52ceb23d8bb279a41e64793f561c21214179c617e0cd48b7db8c3"} Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.487081 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.487143 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.498416 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="866615de-c907-42a8-835e-918da29902ff" containerID="65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2" exitCode=0 Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.498597 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" event={"ID":"866615de-c907-42a8-835e-918da29902ff","Type":"ContainerDied","Data":"65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2"} Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.498649 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" event={"ID":"866615de-c907-42a8-835e-918da29902ff","Type":"ContainerDied","Data":"c99bb28fedc518a884067b76d3d60b93b5c9ba7cba1b8597a401220613efea5e"} Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.498674 5047 scope.go:117] "RemoveContainer" containerID="65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.499036 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc9bf8d5-tkl62" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.527799 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6ff9694944-nmq4c" podStartSLOduration=2.527764921 podStartE2EDuration="2.527764921s" podCreationTimestamp="2026-02-23 08:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:50:22.510982211 +0000 UTC m=+7544.762309365" watchObservedRunningTime="2026-02-23 08:50:22.527764921 +0000 UTC m=+7544.779092065" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.535690 5047 scope.go:117] "RemoveContainer" containerID="9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.554359 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.563355 5047 scope.go:117] "RemoveContainer" containerID="65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2" Feb 23 08:50:22 crc kubenswrapper[5047]: E0223 08:50:22.564016 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2\": container with ID starting with 65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2 not found: ID does not exist" containerID="65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.564205 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2"} err="failed to get container status 
\"65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2\": rpc error: code = NotFound desc = could not find container \"65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2\": container with ID starting with 65d6527c9fd975141f1e88f9856495da6bfef1f371f62015cad478b6e9a4daf2 not found: ID does not exist" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.564324 5047 scope.go:117] "RemoveContainer" containerID="9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05" Feb 23 08:50:22 crc kubenswrapper[5047]: E0223 08:50:22.565190 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05\": container with ID starting with 9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05 not found: ID does not exist" containerID="9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.565243 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05"} err="failed to get container status \"9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05\": rpc error: code = NotFound desc = could not find container \"9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05\": container with ID starting with 9fbd843f28c36caba22d7af5245651d3660d2c1b95aa7a7ff25eb1170cf0db05 not found: ID does not exist" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.962973 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc\") pod \"866615de-c907-42a8-835e-918da29902ff\" (UID: \"866615de-c907-42a8-835e-918da29902ff\") " Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.963647 5047 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "866615de-c907-42a8-835e-918da29902ff" (UID: "866615de-c907-42a8-835e-918da29902ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:50:22 crc kubenswrapper[5047]: I0223 08:50:22.963855 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/866615de-c907-42a8-835e-918da29902ff-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:23 crc kubenswrapper[5047]: I0223 08:50:23.149857 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc9bf8d5-tkl62"] Feb 23 08:50:23 crc kubenswrapper[5047]: I0223 08:50:23.157240 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fbc9bf8d5-tkl62"] Feb 23 08:50:23 crc kubenswrapper[5047]: I0223 08:50:23.342308 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:50:23 crc kubenswrapper[5047]: E0223 08:50:23.343051 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:50:24 crc kubenswrapper[5047]: I0223 08:50:24.351209 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="866615de-c907-42a8-835e-918da29902ff" path="/var/lib/kubelet/pods/866615de-c907-42a8-835e-918da29902ff/volumes" Feb 23 08:50:34 crc kubenswrapper[5047]: I0223 08:50:34.340785 5047 scope.go:117] "RemoveContainer" 
containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:50:34 crc kubenswrapper[5047]: E0223 08:50:34.343843 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:50:46 crc kubenswrapper[5047]: I0223 08:50:46.341778 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:50:46 crc kubenswrapper[5047]: E0223 08:50:46.343290 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:50:51 crc kubenswrapper[5047]: I0223 08:50:51.975459 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:51 crc kubenswrapper[5047]: I0223 08:50:51.984701 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 08:50:59 crc kubenswrapper[5047]: I0223 08:50:59.341900 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:50:59 crc kubenswrapper[5047]: E0223 08:50:59.343372 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:51:11 crc kubenswrapper[5047]: I0223 08:51:11.341314 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:51:11 crc kubenswrapper[5047]: E0223 08:51:11.342845 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.252674 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9pg8r"] Feb 23 08:51:14 crc kubenswrapper[5047]: E0223 08:51:14.253579 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866615de-c907-42a8-835e-918da29902ff" containerName="init" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.253596 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="866615de-c907-42a8-835e-918da29902ff" containerName="init" Feb 23 08:51:14 crc kubenswrapper[5047]: E0223 08:51:14.253623 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866615de-c907-42a8-835e-918da29902ff" containerName="dnsmasq-dns" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.253631 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="866615de-c907-42a8-835e-918da29902ff" containerName="dnsmasq-dns" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.253824 5047 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="866615de-c907-42a8-835e-918da29902ff" containerName="dnsmasq-dns" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.254625 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.262641 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9pg8r"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.372035 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-p9mvk"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.374536 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.396135 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc65c\" (UniqueName: \"kubernetes.io/projected/7a81e655-7763-4211-a07b-8fcf1ab82ede-kube-api-access-bc65c\") pod \"nova-api-db-create-9pg8r\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.396286 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a81e655-7763-4211-a07b-8fcf1ab82ede-operator-scripts\") pod \"nova-api-db-create-9pg8r\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.422409 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p9mvk"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.467986 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0ef9-account-create-update-dwtpm"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.469417 5047 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.474168 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.488298 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-dwtpm"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.498051 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vtb\" (UniqueName: \"kubernetes.io/projected/8f76d580-9fff-4d98-854c-01082ff6b82a-kube-api-access-m5vtb\") pod \"nova-cell0-db-create-p9mvk\" (UID: \"8f76d580-9fff-4d98-854c-01082ff6b82a\") " pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.498152 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a81e655-7763-4211-a07b-8fcf1ab82ede-operator-scripts\") pod \"nova-api-db-create-9pg8r\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.498251 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc65c\" (UniqueName: \"kubernetes.io/projected/7a81e655-7763-4211-a07b-8fcf1ab82ede-kube-api-access-bc65c\") pod \"nova-api-db-create-9pg8r\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.498289 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f76d580-9fff-4d98-854c-01082ff6b82a-operator-scripts\") pod \"nova-cell0-db-create-p9mvk\" (UID: 
\"8f76d580-9fff-4d98-854c-01082ff6b82a\") " pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.498856 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a81e655-7763-4211-a07b-8fcf1ab82ede-operator-scripts\") pod \"nova-api-db-create-9pg8r\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.525681 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc65c\" (UniqueName: \"kubernetes.io/projected/7a81e655-7763-4211-a07b-8fcf1ab82ede-kube-api-access-bc65c\") pod \"nova-api-db-create-9pg8r\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.559079 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wwttw"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.560547 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.573855 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wwttw"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.575665 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.600707 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e634979e-62c4-4419-8917-e9e4b008de10-operator-scripts\") pod \"nova-cell1-db-create-wwttw\" (UID: \"e634979e-62c4-4419-8917-e9e4b008de10\") " pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.600783 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dql4j\" (UniqueName: \"kubernetes.io/projected/f13aae8d-09b0-4975-80da-a02d609028d7-kube-api-access-dql4j\") pod \"nova-api-0ef9-account-create-update-dwtpm\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.600817 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13aae8d-09b0-4975-80da-a02d609028d7-operator-scripts\") pod \"nova-api-0ef9-account-create-update-dwtpm\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.600855 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l944s\" (UniqueName: \"kubernetes.io/projected/e634979e-62c4-4419-8917-e9e4b008de10-kube-api-access-l944s\") pod \"nova-cell1-db-create-wwttw\" (UID: \"e634979e-62c4-4419-8917-e9e4b008de10\") " pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.600879 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8f76d580-9fff-4d98-854c-01082ff6b82a-operator-scripts\") pod \"nova-cell0-db-create-p9mvk\" (UID: \"8f76d580-9fff-4d98-854c-01082ff6b82a\") " pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.601026 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vtb\" (UniqueName: \"kubernetes.io/projected/8f76d580-9fff-4d98-854c-01082ff6b82a-kube-api-access-m5vtb\") pod \"nova-cell0-db-create-p9mvk\" (UID: \"8f76d580-9fff-4d98-854c-01082ff6b82a\") " pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.601735 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f76d580-9fff-4d98-854c-01082ff6b82a-operator-scripts\") pod \"nova-cell0-db-create-p9mvk\" (UID: \"8f76d580-9fff-4d98-854c-01082ff6b82a\") " pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.624731 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vtb\" (UniqueName: \"kubernetes.io/projected/8f76d580-9fff-4d98-854c-01082ff6b82a-kube-api-access-m5vtb\") pod \"nova-cell0-db-create-p9mvk\" (UID: \"8f76d580-9fff-4d98-854c-01082ff6b82a\") " pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.658062 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-6p4rv"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.659391 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.663415 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.685245 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-6p4rv"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.702991 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e634979e-62c4-4419-8917-e9e4b008de10-operator-scripts\") pod \"nova-cell1-db-create-wwttw\" (UID: \"e634979e-62c4-4419-8917-e9e4b008de10\") " pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.703265 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dql4j\" (UniqueName: \"kubernetes.io/projected/f13aae8d-09b0-4975-80da-a02d609028d7-kube-api-access-dql4j\") pod \"nova-api-0ef9-account-create-update-dwtpm\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.703293 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13aae8d-09b0-4975-80da-a02d609028d7-operator-scripts\") pod \"nova-api-0ef9-account-create-update-dwtpm\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.703323 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l944s\" (UniqueName: \"kubernetes.io/projected/e634979e-62c4-4419-8917-e9e4b008de10-kube-api-access-l944s\") pod \"nova-cell1-db-create-wwttw\" (UID: 
\"e634979e-62c4-4419-8917-e9e4b008de10\") " pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.703375 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-operator-scripts\") pod \"nova-cell0-aad1-account-create-update-6p4rv\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.703405 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bptf\" (UniqueName: \"kubernetes.io/projected/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-kube-api-access-4bptf\") pod \"nova-cell0-aad1-account-create-update-6p4rv\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.704112 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e634979e-62c4-4419-8917-e9e4b008de10-operator-scripts\") pod \"nova-cell1-db-create-wwttw\" (UID: \"e634979e-62c4-4419-8917-e9e4b008de10\") " pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.704245 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13aae8d-09b0-4975-80da-a02d609028d7-operator-scripts\") pod \"nova-api-0ef9-account-create-update-dwtpm\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.726936 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.732599 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l944s\" (UniqueName: \"kubernetes.io/projected/e634979e-62c4-4419-8917-e9e4b008de10-kube-api-access-l944s\") pod \"nova-cell1-db-create-wwttw\" (UID: \"e634979e-62c4-4419-8917-e9e4b008de10\") " pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.733126 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dql4j\" (UniqueName: \"kubernetes.io/projected/f13aae8d-09b0-4975-80da-a02d609028d7-kube-api-access-dql4j\") pod \"nova-api-0ef9-account-create-update-dwtpm\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.805345 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-operator-scripts\") pod \"nova-cell0-aad1-account-create-update-6p4rv\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.805398 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bptf\" (UniqueName: \"kubernetes.io/projected/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-kube-api-access-4bptf\") pod \"nova-cell0-aad1-account-create-update-6p4rv\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.806600 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-operator-scripts\") pod 
\"nova-cell0-aad1-account-create-update-6p4rv\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.843191 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bptf\" (UniqueName: \"kubernetes.io/projected/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-kube-api-access-4bptf\") pod \"nova-cell0-aad1-account-create-update-6p4rv\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.854224 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-xkq2k"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.855824 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.860780 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.868643 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-xkq2k"] Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.901102 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.906546 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/699962d8-9437-4095-8743-67593f336f95-operator-scripts\") pod \"nova-cell1-b6cf-account-create-update-xkq2k\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.906660 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm98c\" (UniqueName: \"kubernetes.io/projected/699962d8-9437-4095-8743-67593f336f95-kube-api-access-vm98c\") pod \"nova-cell1-b6cf-account-create-update-xkq2k\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:14 crc kubenswrapper[5047]: I0223 08:51:14.987258 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.004829 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.007846 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/699962d8-9437-4095-8743-67593f336f95-operator-scripts\") pod \"nova-cell1-b6cf-account-create-update-xkq2k\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.007956 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm98c\" (UniqueName: \"kubernetes.io/projected/699962d8-9437-4095-8743-67593f336f95-kube-api-access-vm98c\") pod \"nova-cell1-b6cf-account-create-update-xkq2k\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.008866 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/699962d8-9437-4095-8743-67593f336f95-operator-scripts\") pod \"nova-cell1-b6cf-account-create-update-xkq2k\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.035259 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm98c\" (UniqueName: \"kubernetes.io/projected/699962d8-9437-4095-8743-67593f336f95-kube-api-access-vm98c\") pod \"nova-cell1-b6cf-account-create-update-xkq2k\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.155144 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9pg8r"] Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 
08:51:15.178129 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.292159 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p9mvk"] Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.425338 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-dwtpm"] Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.551609 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-6p4rv"] Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.570287 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wwttw"] Feb 23 08:51:15 crc kubenswrapper[5047]: W0223 08:51:15.587104 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode634979e_62c4_4419_8917_e9e4b008de10.slice/crio-04c60c50ccb03155ff0083fa0a9319a0a899f5e89f315fba7c78c71ea8a96a4b WatchSource:0}: Error finding container 04c60c50ccb03155ff0083fa0a9319a0a899f5e89f315fba7c78c71ea8a96a4b: Status 404 returned error can't find the container with id 04c60c50ccb03155ff0083fa0a9319a0a899f5e89f315fba7c78c71ea8a96a4b Feb 23 08:51:15 crc kubenswrapper[5047]: I0223 08:51:15.768843 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-xkq2k"] Feb 23 08:51:15 crc kubenswrapper[5047]: W0223 08:51:15.812809 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod699962d8_9437_4095_8743_67593f336f95.slice/crio-36b4d0434f19045da6bce36ade2b5accbcd05d8ea4cec0574eb58b582e9fab0a WatchSource:0}: Error finding container 36b4d0434f19045da6bce36ade2b5accbcd05d8ea4cec0574eb58b582e9fab0a: Status 404 returned error 
can't find the container with id 36b4d0434f19045da6bce36ade2b5accbcd05d8ea4cec0574eb58b582e9fab0a Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.115535 5047 generic.go:334] "Generic (PLEG): container finished" podID="7a81e655-7763-4211-a07b-8fcf1ab82ede" containerID="518e4d6da9c96b15872e77d6d277ee4e6e57682d3ea3a0de2b053eb45a8a73e0" exitCode=0 Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.115604 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9pg8r" event={"ID":"7a81e655-7763-4211-a07b-8fcf1ab82ede","Type":"ContainerDied","Data":"518e4d6da9c96b15872e77d6d277ee4e6e57682d3ea3a0de2b053eb45a8a73e0"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.116467 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9pg8r" event={"ID":"7a81e655-7763-4211-a07b-8fcf1ab82ede","Type":"ContainerStarted","Data":"d4416479016872c453b65145cc7fd2398fe5efff393fbbba883c01aa5b4ddc3e"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.120311 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwttw" event={"ID":"e634979e-62c4-4419-8917-e9e4b008de10","Type":"ContainerStarted","Data":"87f6e8f1567802a3589710cbc69996fba4d07e1b87bdd2d285304d8b7b5997b1"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.120357 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwttw" event={"ID":"e634979e-62c4-4419-8917-e9e4b008de10","Type":"ContainerStarted","Data":"04c60c50ccb03155ff0083fa0a9319a0a899f5e89f315fba7c78c71ea8a96a4b"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.122205 5047 generic.go:334] "Generic (PLEG): container finished" podID="8f76d580-9fff-4d98-854c-01082ff6b82a" containerID="7badcde11eeef8d36fa1c5d05192588cff9fabb6a6326c412253d56983b6de3a" exitCode=0 Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.122264 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-p9mvk" event={"ID":"8f76d580-9fff-4d98-854c-01082ff6b82a","Type":"ContainerDied","Data":"7badcde11eeef8d36fa1c5d05192588cff9fabb6a6326c412253d56983b6de3a"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.122284 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p9mvk" event={"ID":"8f76d580-9fff-4d98-854c-01082ff6b82a","Type":"ContainerStarted","Data":"d4cd76797aa5b8d8a00cb088d5dfb4732db22174ebcb2c1c35b5917e65e5e36c"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.124211 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" event={"ID":"699962d8-9437-4095-8743-67593f336f95","Type":"ContainerStarted","Data":"84015c9c9972656e9b3d1708d54e6f2eab6c8245200499b42efa88851b8205f3"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.124239 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" event={"ID":"699962d8-9437-4095-8743-67593f336f95","Type":"ContainerStarted","Data":"36b4d0434f19045da6bce36ade2b5accbcd05d8ea4cec0574eb58b582e9fab0a"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.125845 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" event={"ID":"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6","Type":"ContainerStarted","Data":"6f3df3e41ece39ce4894e89c3599e80bdf48470ac34e98e79a40c703bb5ad2cb"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.125870 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" event={"ID":"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6","Type":"ContainerStarted","Data":"1401c8d7e9a31c90788162925f574ac49db72faff3202007b0607a9128d01ad5"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.127635 5047 generic.go:334] "Generic (PLEG): container finished" podID="f13aae8d-09b0-4975-80da-a02d609028d7" 
containerID="c96385941abbc12d87ca7d23f31336f6e5447d8ab180955bf9e313dba733d5fb" exitCode=0 Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.127664 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0ef9-account-create-update-dwtpm" event={"ID":"f13aae8d-09b0-4975-80da-a02d609028d7","Type":"ContainerDied","Data":"c96385941abbc12d87ca7d23f31336f6e5447d8ab180955bf9e313dba733d5fb"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.127679 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0ef9-account-create-update-dwtpm" event={"ID":"f13aae8d-09b0-4975-80da-a02d609028d7","Type":"ContainerStarted","Data":"0b96548131558fbbffd8b707e8149f047ef47c092969b52ce1b916674588ae2c"} Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.152554 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" podStartSLOduration=2.152532634 podStartE2EDuration="2.152532634s" podCreationTimestamp="2026-02-23 08:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:16.151828385 +0000 UTC m=+7598.403155519" watchObservedRunningTime="2026-02-23 08:51:16.152532634 +0000 UTC m=+7598.403859768" Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.189839 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-wwttw" podStartSLOduration=2.189815294 podStartE2EDuration="2.189815294s" podCreationTimestamp="2026-02-23 08:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:16.183048883 +0000 UTC m=+7598.434376027" watchObservedRunningTime="2026-02-23 08:51:16.189815294 +0000 UTC m=+7598.441142448" Feb 23 08:51:16 crc kubenswrapper[5047]: I0223 08:51:16.250092 5047 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" podStartSLOduration=2.250066732 podStartE2EDuration="2.250066732s" podCreationTimestamp="2026-02-23 08:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:16.228439801 +0000 UTC m=+7598.479766935" watchObservedRunningTime="2026-02-23 08:51:16.250066732 +0000 UTC m=+7598.501393886" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.146462 5047 generic.go:334] "Generic (PLEG): container finished" podID="3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6" containerID="6f3df3e41ece39ce4894e89c3599e80bdf48470ac34e98e79a40c703bb5ad2cb" exitCode=0 Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.146723 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" event={"ID":"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6","Type":"ContainerDied","Data":"6f3df3e41ece39ce4894e89c3599e80bdf48470ac34e98e79a40c703bb5ad2cb"} Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.149565 5047 generic.go:334] "Generic (PLEG): container finished" podID="e634979e-62c4-4419-8917-e9e4b008de10" containerID="87f6e8f1567802a3589710cbc69996fba4d07e1b87bdd2d285304d8b7b5997b1" exitCode=0 Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.149660 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwttw" event={"ID":"e634979e-62c4-4419-8917-e9e4b008de10","Type":"ContainerDied","Data":"87f6e8f1567802a3589710cbc69996fba4d07e1b87bdd2d285304d8b7b5997b1"} Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.151606 5047 generic.go:334] "Generic (PLEG): container finished" podID="699962d8-9437-4095-8743-67593f336f95" containerID="84015c9c9972656e9b3d1708d54e6f2eab6c8245200499b42efa88851b8205f3" exitCode=0 Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.151848 5047 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" event={"ID":"699962d8-9437-4095-8743-67593f336f95","Type":"ContainerDied","Data":"84015c9c9972656e9b3d1708d54e6f2eab6c8245200499b42efa88851b8205f3"} Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.604177 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.680364 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13aae8d-09b0-4975-80da-a02d609028d7-operator-scripts\") pod \"f13aae8d-09b0-4975-80da-a02d609028d7\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.680415 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dql4j\" (UniqueName: \"kubernetes.io/projected/f13aae8d-09b0-4975-80da-a02d609028d7-kube-api-access-dql4j\") pod \"f13aae8d-09b0-4975-80da-a02d609028d7\" (UID: \"f13aae8d-09b0-4975-80da-a02d609028d7\") " Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.681139 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13aae8d-09b0-4975-80da-a02d609028d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f13aae8d-09b0-4975-80da-a02d609028d7" (UID: "f13aae8d-09b0-4975-80da-a02d609028d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.686779 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13aae8d-09b0-4975-80da-a02d609028d7-kube-api-access-dql4j" (OuterVolumeSpecName: "kube-api-access-dql4j") pod "f13aae8d-09b0-4975-80da-a02d609028d7" (UID: "f13aae8d-09b0-4975-80da-a02d609028d7"). InnerVolumeSpecName "kube-api-access-dql4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.746592 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.751443 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.781864 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5vtb\" (UniqueName: \"kubernetes.io/projected/8f76d580-9fff-4d98-854c-01082ff6b82a-kube-api-access-m5vtb\") pod \"8f76d580-9fff-4d98-854c-01082ff6b82a\" (UID: \"8f76d580-9fff-4d98-854c-01082ff6b82a\") " Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.781931 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f76d580-9fff-4d98-854c-01082ff6b82a-operator-scripts\") pod \"8f76d580-9fff-4d98-854c-01082ff6b82a\" (UID: \"8f76d580-9fff-4d98-854c-01082ff6b82a\") " Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.782803 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f13aae8d-09b0-4975-80da-a02d609028d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.782839 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dql4j\" (UniqueName: \"kubernetes.io/projected/f13aae8d-09b0-4975-80da-a02d609028d7-kube-api-access-dql4j\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.783063 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f76d580-9fff-4d98-854c-01082ff6b82a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"8f76d580-9fff-4d98-854c-01082ff6b82a" (UID: "8f76d580-9fff-4d98-854c-01082ff6b82a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.786595 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f76d580-9fff-4d98-854c-01082ff6b82a-kube-api-access-m5vtb" (OuterVolumeSpecName: "kube-api-access-m5vtb") pod "8f76d580-9fff-4d98-854c-01082ff6b82a" (UID: "8f76d580-9fff-4d98-854c-01082ff6b82a"). InnerVolumeSpecName "kube-api-access-m5vtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.884091 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc65c\" (UniqueName: \"kubernetes.io/projected/7a81e655-7763-4211-a07b-8fcf1ab82ede-kube-api-access-bc65c\") pod \"7a81e655-7763-4211-a07b-8fcf1ab82ede\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.884842 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a81e655-7763-4211-a07b-8fcf1ab82ede-operator-scripts\") pod \"7a81e655-7763-4211-a07b-8fcf1ab82ede\" (UID: \"7a81e655-7763-4211-a07b-8fcf1ab82ede\") " Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.885158 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a81e655-7763-4211-a07b-8fcf1ab82ede-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a81e655-7763-4211-a07b-8fcf1ab82ede" (UID: "7a81e655-7763-4211-a07b-8fcf1ab82ede"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.885484 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a81e655-7763-4211-a07b-8fcf1ab82ede-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.885523 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5vtb\" (UniqueName: \"kubernetes.io/projected/8f76d580-9fff-4d98-854c-01082ff6b82a-kube-api-access-m5vtb\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.885538 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f76d580-9fff-4d98-854c-01082ff6b82a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.888304 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a81e655-7763-4211-a07b-8fcf1ab82ede-kube-api-access-bc65c" (OuterVolumeSpecName: "kube-api-access-bc65c") pod "7a81e655-7763-4211-a07b-8fcf1ab82ede" (UID: "7a81e655-7763-4211-a07b-8fcf1ab82ede"). InnerVolumeSpecName "kube-api-access-bc65c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:17 crc kubenswrapper[5047]: I0223 08:51:17.987731 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc65c\" (UniqueName: \"kubernetes.io/projected/7a81e655-7763-4211-a07b-8fcf1ab82ede-kube-api-access-bc65c\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.165182 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9pg8r" event={"ID":"7a81e655-7763-4211-a07b-8fcf1ab82ede","Type":"ContainerDied","Data":"d4416479016872c453b65145cc7fd2398fe5efff393fbbba883c01aa5b4ddc3e"} Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.165214 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9pg8r" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.165232 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4416479016872c453b65145cc7fd2398fe5efff393fbbba883c01aa5b4ddc3e" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.167299 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p9mvk" event={"ID":"8f76d580-9fff-4d98-854c-01082ff6b82a","Type":"ContainerDied","Data":"d4cd76797aa5b8d8a00cb088d5dfb4732db22174ebcb2c1c35b5917e65e5e36c"} Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.167336 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4cd76797aa5b8d8a00cb088d5dfb4732db22174ebcb2c1c35b5917e65e5e36c" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.167420 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p9mvk" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.172729 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0ef9-account-create-update-dwtpm" event={"ID":"f13aae8d-09b0-4975-80da-a02d609028d7","Type":"ContainerDied","Data":"0b96548131558fbbffd8b707e8149f047ef47c092969b52ce1b916674588ae2c"} Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.172961 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b96548131558fbbffd8b707e8149f047ef47c092969b52ce1b916674588ae2c" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.172831 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-dwtpm" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.627145 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.700640 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm98c\" (UniqueName: \"kubernetes.io/projected/699962d8-9437-4095-8743-67593f336f95-kube-api-access-vm98c\") pod \"699962d8-9437-4095-8743-67593f336f95\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.700773 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/699962d8-9437-4095-8743-67593f336f95-operator-scripts\") pod \"699962d8-9437-4095-8743-67593f336f95\" (UID: \"699962d8-9437-4095-8743-67593f336f95\") " Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.703188 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699962d8-9437-4095-8743-67593f336f95-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "699962d8-9437-4095-8743-67593f336f95" (UID: "699962d8-9437-4095-8743-67593f336f95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.703837 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/699962d8-9437-4095-8743-67593f336f95-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.709319 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699962d8-9437-4095-8743-67593f336f95-kube-api-access-vm98c" (OuterVolumeSpecName: "kube-api-access-vm98c") pod "699962d8-9437-4095-8743-67593f336f95" (UID: "699962d8-9437-4095-8743-67593f336f95"). InnerVolumeSpecName "kube-api-access-vm98c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.798066 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.803458 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.805121 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm98c\" (UniqueName: \"kubernetes.io/projected/699962d8-9437-4095-8743-67593f336f95-kube-api-access-vm98c\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.906161 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bptf\" (UniqueName: \"kubernetes.io/projected/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-kube-api-access-4bptf\") pod \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.906342 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e634979e-62c4-4419-8917-e9e4b008de10-operator-scripts\") pod \"e634979e-62c4-4419-8917-e9e4b008de10\" (UID: \"e634979e-62c4-4419-8917-e9e4b008de10\") " Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.906384 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l944s\" (UniqueName: \"kubernetes.io/projected/e634979e-62c4-4419-8917-e9e4b008de10-kube-api-access-l944s\") pod \"e634979e-62c4-4419-8917-e9e4b008de10\" (UID: \"e634979e-62c4-4419-8917-e9e4b008de10\") " Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.906463 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-operator-scripts\") pod \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\" (UID: \"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6\") " Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.907581 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e634979e-62c4-4419-8917-e9e4b008de10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e634979e-62c4-4419-8917-e9e4b008de10" (UID: "e634979e-62c4-4419-8917-e9e4b008de10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.907753 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6" (UID: "3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.911417 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-kube-api-access-4bptf" (OuterVolumeSpecName: "kube-api-access-4bptf") pod "3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6" (UID: "3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6"). InnerVolumeSpecName "kube-api-access-4bptf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:18 crc kubenswrapper[5047]: I0223 08:51:18.912203 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e634979e-62c4-4419-8917-e9e4b008de10-kube-api-access-l944s" (OuterVolumeSpecName: "kube-api-access-l944s") pod "e634979e-62c4-4419-8917-e9e4b008de10" (UID: "e634979e-62c4-4419-8917-e9e4b008de10"). InnerVolumeSpecName "kube-api-access-l944s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.010219 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bptf\" (UniqueName: \"kubernetes.io/projected/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-kube-api-access-4bptf\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.010271 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e634979e-62c4-4419-8917-e9e4b008de10-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.010291 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l944s\" (UniqueName: \"kubernetes.io/projected/e634979e-62c4-4419-8917-e9e4b008de10-kube-api-access-l944s\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.010309 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.194661 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" event={"ID":"699962d8-9437-4095-8743-67593f336f95","Type":"ContainerDied","Data":"36b4d0434f19045da6bce36ade2b5accbcd05d8ea4cec0574eb58b582e9fab0a"} Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.194732 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36b4d0434f19045da6bce36ade2b5accbcd05d8ea4cec0574eb58b582e9fab0a" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.194774 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-xkq2k" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.206793 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.207729 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aad1-account-create-update-6p4rv" event={"ID":"3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6","Type":"ContainerDied","Data":"1401c8d7e9a31c90788162925f574ac49db72faff3202007b0607a9128d01ad5"} Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.207829 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1401c8d7e9a31c90788162925f574ac49db72faff3202007b0607a9128d01ad5" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.214710 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwttw" event={"ID":"e634979e-62c4-4419-8917-e9e4b008de10","Type":"ContainerDied","Data":"04c60c50ccb03155ff0083fa0a9319a0a899f5e89f315fba7c78c71ea8a96a4b"} Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.214762 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c60c50ccb03155ff0083fa0a9319a0a899f5e89f315fba7c78c71ea8a96a4b" Feb 23 08:51:19 crc kubenswrapper[5047]: I0223 08:51:19.214852 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wwttw" Feb 23 08:51:23 crc kubenswrapper[5047]: I0223 08:51:23.341654 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:51:23 crc kubenswrapper[5047]: E0223 08:51:23.342767 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.891743 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wg7nn"] Feb 23 08:51:24 crc kubenswrapper[5047]: E0223 08:51:24.892471 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892493 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: E0223 08:51:24.892511 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699962d8-9437-4095-8743-67593f336f95" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892517 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="699962d8-9437-4095-8743-67593f336f95" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: E0223 08:51:24.892531 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e634979e-62c4-4419-8917-e9e4b008de10" containerName="mariadb-database-create" Feb 23 08:51:24 crc 
kubenswrapper[5047]: I0223 08:51:24.892538 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e634979e-62c4-4419-8917-e9e4b008de10" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: E0223 08:51:24.892546 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a81e655-7763-4211-a07b-8fcf1ab82ede" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892552 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a81e655-7763-4211-a07b-8fcf1ab82ede" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: E0223 08:51:24.892563 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13aae8d-09b0-4975-80da-a02d609028d7" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892569 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13aae8d-09b0-4975-80da-a02d609028d7" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: E0223 08:51:24.892580 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f76d580-9fff-4d98-854c-01082ff6b82a" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892586 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f76d580-9fff-4d98-854c-01082ff6b82a" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892741 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a81e655-7763-4211-a07b-8fcf1ab82ede" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892759 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13aae8d-09b0-4975-80da-a02d609028d7" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892766 5047 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892776 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="699962d8-9437-4095-8743-67593f336f95" containerName="mariadb-account-create-update" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892792 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f76d580-9fff-4d98-854c-01082ff6b82a" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.892805 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e634979e-62c4-4419-8917-e9e4b008de10" containerName="mariadb-database-create" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.893407 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.896244 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4dh6l" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.897867 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.897979 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.910538 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wg7nn"] Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.939446 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p8xl\" (UniqueName: \"kubernetes.io/projected/5e16a6f3-b78e-4388-a881-8371a2e2295f-kube-api-access-2p8xl\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " 
pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.939608 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-config-data\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.939645 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-scripts\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:24 crc kubenswrapper[5047]: I0223 08:51:24.939685 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.041201 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-config-data\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.041260 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-scripts\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: 
\"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.041290 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.041380 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p8xl\" (UniqueName: \"kubernetes.io/projected/5e16a6f3-b78e-4388-a881-8371a2e2295f-kube-api-access-2p8xl\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.048165 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-config-data\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.054732 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-scripts\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.057679 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p8xl\" (UniqueName: \"kubernetes.io/projected/5e16a6f3-b78e-4388-a881-8371a2e2295f-kube-api-access-2p8xl\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: 
\"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.057992 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wg7nn\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.214941 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:25 crc kubenswrapper[5047]: I0223 08:51:25.743211 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wg7nn"] Feb 23 08:51:26 crc kubenswrapper[5047]: I0223 08:51:26.280998 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" event={"ID":"5e16a6f3-b78e-4388-a881-8371a2e2295f","Type":"ContainerStarted","Data":"0baa2336bb30b57b5ec070becf20069a56552ab7ce4dd8041ed48c516135af68"} Feb 23 08:51:34 crc kubenswrapper[5047]: I0223 08:51:34.341839 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:51:34 crc kubenswrapper[5047]: E0223 08:51:34.342949 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:51:35 crc kubenswrapper[5047]: I0223 08:51:35.392739 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-wg7nn" event={"ID":"5e16a6f3-b78e-4388-a881-8371a2e2295f","Type":"ContainerStarted","Data":"62d3ce1c8e36802de08108bf2d341c7932489c2988b9b51c0a31bb156c86c725"} Feb 23 08:51:35 crc kubenswrapper[5047]: I0223 08:51:35.427215 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" podStartSLOduration=2.624720466 podStartE2EDuration="11.427181637s" podCreationTimestamp="2026-02-23 08:51:24 +0000 UTC" firstStartedPulling="2026-02-23 08:51:25.746926213 +0000 UTC m=+7607.998253347" lastFinishedPulling="2026-02-23 08:51:34.549387384 +0000 UTC m=+7616.800714518" observedRunningTime="2026-02-23 08:51:35.421218037 +0000 UTC m=+7617.672545201" watchObservedRunningTime="2026-02-23 08:51:35.427181637 +0000 UTC m=+7617.678508801" Feb 23 08:51:40 crc kubenswrapper[5047]: I0223 08:51:40.442485 5047 generic.go:334] "Generic (PLEG): container finished" podID="5e16a6f3-b78e-4388-a881-8371a2e2295f" containerID="62d3ce1c8e36802de08108bf2d341c7932489c2988b9b51c0a31bb156c86c725" exitCode=0 Feb 23 08:51:40 crc kubenswrapper[5047]: I0223 08:51:40.442579 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" event={"ID":"5e16a6f3-b78e-4388-a881-8371a2e2295f","Type":"ContainerDied","Data":"62d3ce1c8e36802de08108bf2d341c7932489c2988b9b51c0a31bb156c86c725"} Feb 23 08:51:41 crc kubenswrapper[5047]: I0223 08:51:41.940431 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.112002 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-combined-ca-bundle\") pod \"5e16a6f3-b78e-4388-a881-8371a2e2295f\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.112375 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p8xl\" (UniqueName: \"kubernetes.io/projected/5e16a6f3-b78e-4388-a881-8371a2e2295f-kube-api-access-2p8xl\") pod \"5e16a6f3-b78e-4388-a881-8371a2e2295f\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.112539 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-scripts\") pod \"5e16a6f3-b78e-4388-a881-8371a2e2295f\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.112573 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-config-data\") pod \"5e16a6f3-b78e-4388-a881-8371a2e2295f\" (UID: \"5e16a6f3-b78e-4388-a881-8371a2e2295f\") " Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.126362 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e16a6f3-b78e-4388-a881-8371a2e2295f-kube-api-access-2p8xl" (OuterVolumeSpecName: "kube-api-access-2p8xl") pod "5e16a6f3-b78e-4388-a881-8371a2e2295f" (UID: "5e16a6f3-b78e-4388-a881-8371a2e2295f"). InnerVolumeSpecName "kube-api-access-2p8xl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.126561 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-scripts" (OuterVolumeSpecName: "scripts") pod "5e16a6f3-b78e-4388-a881-8371a2e2295f" (UID: "5e16a6f3-b78e-4388-a881-8371a2e2295f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.154236 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e16a6f3-b78e-4388-a881-8371a2e2295f" (UID: "5e16a6f3-b78e-4388-a881-8371a2e2295f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.163205 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-config-data" (OuterVolumeSpecName: "config-data") pod "5e16a6f3-b78e-4388-a881-8371a2e2295f" (UID: "5e16a6f3-b78e-4388-a881-8371a2e2295f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.214804 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.214847 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.214862 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16a6f3-b78e-4388-a881-8371a2e2295f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.214877 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p8xl\" (UniqueName: \"kubernetes.io/projected/5e16a6f3-b78e-4388-a881-8371a2e2295f-kube-api-access-2p8xl\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.463395 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" event={"ID":"5e16a6f3-b78e-4388-a881-8371a2e2295f","Type":"ContainerDied","Data":"0baa2336bb30b57b5ec070becf20069a56552ab7ce4dd8041ed48c516135af68"} Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.463431 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0baa2336bb30b57b5ec070becf20069a56552ab7ce4dd8041ed48c516135af68" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.463485 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wg7nn" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.607461 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 08:51:42 crc kubenswrapper[5047]: E0223 08:51:42.607957 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e16a6f3-b78e-4388-a881-8371a2e2295f" containerName="nova-cell0-conductor-db-sync" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.607974 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e16a6f3-b78e-4388-a881-8371a2e2295f" containerName="nova-cell0-conductor-db-sync" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.608242 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e16a6f3-b78e-4388-a881-8371a2e2295f" containerName="nova-cell0-conductor-db-sync" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.609074 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.611321 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4dh6l" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.613209 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.621654 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.727453 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: 
I0223 08:51:42.728555 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/14988409-8f11-42ca-bc6d-a4ba3d3056a4-kube-api-access-qfskw\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.728728 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.830712 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/14988409-8f11-42ca-bc6d-a4ba3d3056a4-kube-api-access-qfskw\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.830762 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.830855 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.837457 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.838556 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.862443 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/14988409-8f11-42ca-bc6d-a4ba3d3056a4-kube-api-access-qfskw\") pod \"nova-cell0-conductor-0\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:42 crc kubenswrapper[5047]: I0223 08:51:42.948250 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:43 crc kubenswrapper[5047]: I0223 08:51:43.547990 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 08:51:44 crc kubenswrapper[5047]: I0223 08:51:44.490429 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"14988409-8f11-42ca-bc6d-a4ba3d3056a4","Type":"ContainerStarted","Data":"b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a"} Feb 23 08:51:44 crc kubenswrapper[5047]: I0223 08:51:44.490974 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"14988409-8f11-42ca-bc6d-a4ba3d3056a4","Type":"ContainerStarted","Data":"9501a06a1f385419b71931a2d8cb4eb06a1be0c307accee6c17be6c2cc013e2e"} Feb 23 08:51:44 crc kubenswrapper[5047]: I0223 08:51:44.491170 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:44 crc kubenswrapper[5047]: I0223 08:51:44.522119 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.522092289 podStartE2EDuration="2.522092289s" podCreationTimestamp="2026-02-23 08:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:44.511693769 +0000 UTC m=+7626.763020933" watchObservedRunningTime="2026-02-23 08:51:44.522092289 +0000 UTC m=+7626.773419443" Feb 23 08:51:48 crc kubenswrapper[5047]: I0223 08:51:48.350114 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:51:48 crc kubenswrapper[5047]: E0223 08:51:48.350950 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:51:52 crc kubenswrapper[5047]: I0223 08:51:52.997238 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.534690 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-49rcc"] Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.538016 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.540969 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.541006 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.553567 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-49rcc"] Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.687954 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/3dde3e53-3ac7-464f-a504-e1b815844cb4-kube-api-access-xkplx\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.688383 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-scripts\") pod 
\"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.688801 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-config-data\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.688830 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.792064 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-config-data\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.792112 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.792143 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/3dde3e53-3ac7-464f-a504-e1b815844cb4-kube-api-access-xkplx\") pod 
\"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.792219 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-scripts\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.808131 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-config-data\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.820009 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.835498 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.843821 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-scripts\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.850674 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkplx\" (UniqueName: 
\"kubernetes.io/projected/3dde3e53-3ac7-464f-a504-e1b815844cb4-kube-api-access-xkplx\") pod \"nova-cell0-cell-mapping-49rcc\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.856793 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.860718 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.874101 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 08:51:53 crc kubenswrapper[5047]: I0223 08:51:53.907861 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.004029 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.004089 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vhm4\" (UniqueName: \"kubernetes.io/projected/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-kube-api-access-8vhm4\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.004119 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-logs\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 
08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.004153 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.073956 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.075560 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.095097 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.110573 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.110647 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vhm4\" (UniqueName: \"kubernetes.io/projected/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-kube-api-access-8vhm4\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.110683 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-logs\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.110724 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.121341 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.133702 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.138358 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-logs\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.157734 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.157799 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.158991 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.165408 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.180300 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vhm4\" (UniqueName: \"kubernetes.io/projected/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-kube-api-access-8vhm4\") pod \"nova-api-0\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.184808 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.215056 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn9qs\" (UniqueName: \"kubernetes.io/projected/18a35c0a-dcef-48db-8033-f48013cada37-kube-api-access-hn9qs\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.215447 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-config-data\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.215516 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.215536 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a35c0a-dcef-48db-8033-f48013cada37-logs\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.318636 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9qs\" (UniqueName: \"kubernetes.io/projected/18a35c0a-dcef-48db-8033-f48013cada37-kube-api-access-hn9qs\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.318728 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-config-data\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.318758 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmtm\" (UniqueName: \"kubernetes.io/projected/023c3c52-cbe1-437b-ba55-f313bd545ba6-kube-api-access-5rmtm\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.318786 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.318805 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.318851 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.318869 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a35c0a-dcef-48db-8033-f48013cada37-logs\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.319408 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a35c0a-dcef-48db-8033-f48013cada37-logs\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.327878 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-config-data\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.329443 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 
08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.347470 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn9qs\" (UniqueName: \"kubernetes.io/projected/18a35c0a-dcef-48db-8033-f48013cada37-kube-api-access-hn9qs\") pod \"nova-metadata-0\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.383299 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dd599d8f5-jwm7f"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.385150 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.390163 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.401125 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dd599d8f5-jwm7f"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.413611 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.414867 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423126 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-dns-svc\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423227 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-nb\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423268 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-sb\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423298 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9596w\" (UniqueName: \"kubernetes.io/projected/29eb3c3a-fe49-482a-95f2-dcb88bd99631-kube-api-access-9596w\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423385 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-config\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: 
\"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423412 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmtm\" (UniqueName: \"kubernetes.io/projected/023c3c52-cbe1-437b-ba55-f313bd545ba6-kube-api-access-5rmtm\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423441 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.423462 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.427521 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.434632 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.434886 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.464566 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmtm\" (UniqueName: \"kubernetes.io/projected/023c3c52-cbe1-437b-ba55-f313bd545ba6-kube-api-access-5rmtm\") pod \"nova-cell1-novncproxy-0\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.467963 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.484710 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.529413 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-config\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.529652 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.536669 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-config-data\") pod \"nova-scheduler-0\" (UID: 
\"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.536775 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-dns-svc\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.537051 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-nb\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.537145 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-sb\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.537236 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9596w\" (UniqueName: \"kubernetes.io/projected/29eb3c3a-fe49-482a-95f2-dcb88bd99631-kube-api-access-9596w\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.537266 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxbrb\" (UniqueName: \"kubernetes.io/projected/b18b33d4-e894-476b-afa1-33794c308ddf-kube-api-access-vxbrb\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " 
pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.537974 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-config\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.538500 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-dns-svc\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.538954 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-sb\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.539884 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-nb\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.541072 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.572106 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9596w\" (UniqueName: \"kubernetes.io/projected/29eb3c3a-fe49-482a-95f2-dcb88bd99631-kube-api-access-9596w\") pod \"dnsmasq-dns-7dd599d8f5-jwm7f\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") " pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.652671 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxbrb\" (UniqueName: \"kubernetes.io/projected/b18b33d4-e894-476b-afa1-33794c308ddf-kube-api-access-vxbrb\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.653329 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.653429 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-config-data\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.667443 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.675779 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxbrb\" (UniqueName: \"kubernetes.io/projected/b18b33d4-e894-476b-afa1-33794c308ddf-kube-api-access-vxbrb\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.678798 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-config-data\") pod \"nova-scheduler-0\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.743997 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.768236 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:51:54 crc kubenswrapper[5047]: I0223 08:51:54.780949 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-49rcc"] Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.618122 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-49rcc" event={"ID":"3dde3e53-3ac7-464f-a504-e1b815844cb4","Type":"ContainerStarted","Data":"a4715fe48675ee8aaa2b2db560444de0eaab0a02b96b2c644b436815f42cbad1"} Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.795544 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-997jz"] Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.801969 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.810622 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.810957 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.843103 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-997jz"] Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.855018 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.893281 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-config-data\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.893456 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-scripts\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.893501 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czptw\" (UniqueName: \"kubernetes.io/projected/dafcfc67-d904-4d7b-946f-1f04e880b98f-kube-api-access-czptw\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 
08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.893521 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.995440 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-config-data\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.995624 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-scripts\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.995660 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czptw\" (UniqueName: \"kubernetes.io/projected/dafcfc67-d904-4d7b-946f-1f04e880b98f-kube-api-access-czptw\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:55 crc kubenswrapper[5047]: I0223 08:51:55.995679 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " 
pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.004727 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-scripts\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.005677 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.010262 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-config-data\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.018465 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czptw\" (UniqueName: \"kubernetes.io/projected/dafcfc67-d904-4d7b-946f-1f04e880b98f-kube-api-access-czptw\") pod \"nova-cell1-conductor-db-sync-997jz\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.128462 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.201874 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:51:56 crc kubenswrapper[5047]: W0223 08:51:56.218656 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18a35c0a_dcef_48db_8033_f48013cada37.slice/crio-149664fcdd98c5506f46884247a52151bceb3ad168ec2502fcf182021b5b9424 WatchSource:0}: Error finding container 149664fcdd98c5506f46884247a52151bceb3ad168ec2502fcf182021b5b9424: Status 404 returned error can't find the container with id 149664fcdd98c5506f46884247a52151bceb3ad168ec2502fcf182021b5b9424 Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.247423 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.264958 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.276613 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dd599d8f5-jwm7f"] Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.629068 5047 generic.go:334] "Generic (PLEG): container finished" podID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerID="94401db850ab8fb8a52690d1a1abec2f75c85554ac7083e24e6a3ea7a646477e" exitCode=0 Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.629168 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" event={"ID":"29eb3c3a-fe49-482a-95f2-dcb88bd99631","Type":"ContainerDied","Data":"94401db850ab8fb8a52690d1a1abec2f75c85554ac7083e24e6a3ea7a646477e"} Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.629204 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" 
event={"ID":"29eb3c3a-fe49-482a-95f2-dcb88bd99631","Type":"ContainerStarted","Data":"06426d728dd750a5be1cc05b5105273cbbd70455cee2b0b364062e08a0b23506"} Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.632390 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"023c3c52-cbe1-437b-ba55-f313bd545ba6","Type":"ContainerStarted","Data":"eb25522ce8432df344a7f8e586d94d6be6a8db16165084ee1ff550447d9300f0"} Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.635605 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-49rcc" event={"ID":"3dde3e53-3ac7-464f-a504-e1b815844cb4","Type":"ContainerStarted","Data":"4ada85ea2224cba0dd0ac5bfad26078501595310d0f4318c0f68ec1cab3e8d84"} Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.637930 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18b33d4-e894-476b-afa1-33794c308ddf","Type":"ContainerStarted","Data":"ff6c950539a0920dfd1f378921c83e117f90cbf9abca078fea04e5e859bfcc57"} Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.640339 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18a35c0a-dcef-48db-8033-f48013cada37","Type":"ContainerStarted","Data":"149664fcdd98c5506f46884247a52151bceb3ad168ec2502fcf182021b5b9424"} Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.642066 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ec21f5-bf2e-4bf7-80ca-b14f21b44516","Type":"ContainerStarted","Data":"faabefe8fc938777ab9bffe200b7f9e335b7586b9b5601f595fa20374b4e4f12"} Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.673947 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-49rcc" podStartSLOduration=3.67392209 podStartE2EDuration="3.67392209s" podCreationTimestamp="2026-02-23 08:51:53 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:56.661429534 +0000 UTC m=+7638.912756668" watchObservedRunningTime="2026-02-23 08:51:56.67392209 +0000 UTC m=+7638.925249224" Feb 23 08:51:56 crc kubenswrapper[5047]: I0223 08:51:56.773788 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-997jz"] Feb 23 08:51:56 crc kubenswrapper[5047]: W0223 08:51:56.781464 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddafcfc67_d904_4d7b_946f_1f04e880b98f.slice/crio-f5f4d50b6c3397e0186eb6bea1c1281983be601e9400aef13131b466bb223e04 WatchSource:0}: Error finding container f5f4d50b6c3397e0186eb6bea1c1281983be601e9400aef13131b466bb223e04: Status 404 returned error can't find the container with id f5f4d50b6c3397e0186eb6bea1c1281983be601e9400aef13131b466bb223e04 Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.103019 5047 scope.go:117] "RemoveContainer" containerID="9d1bf75208e8fb1456ea1b2b64c6f201bcc3869e45634522b9eaefda6c28ed29" Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.157393 5047 scope.go:117] "RemoveContainer" containerID="a981dabcffe67846888ca6f5f16cb9857a76d1c5757cda7a82d7452744a10c9a" Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.195003 5047 scope.go:117] "RemoveContainer" containerID="1e1c7042b0d09b83ed7ebacd98d516960fec21979072b96038406cc60415dac1" Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.657278 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-997jz" event={"ID":"dafcfc67-d904-4d7b-946f-1f04e880b98f","Type":"ContainerStarted","Data":"d5925a76b8dc1aa4699c52a60dfe9bafd55bce85e0b0fe07e7a09153ac94b57d"} Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.657331 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-997jz" event={"ID":"dafcfc67-d904-4d7b-946f-1f04e880b98f","Type":"ContainerStarted","Data":"f5f4d50b6c3397e0186eb6bea1c1281983be601e9400aef13131b466bb223e04"} Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.660470 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" event={"ID":"29eb3c3a-fe49-482a-95f2-dcb88bd99631","Type":"ContainerStarted","Data":"6159200a71b24cc7fa6af5ac76985c1fa1ccf0657187df605681b308fa0d502c"} Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.660501 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.678229 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-997jz" podStartSLOduration=2.678205958 podStartE2EDuration="2.678205958s" podCreationTimestamp="2026-02-23 08:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:57.677868039 +0000 UTC m=+7639.929195173" watchObservedRunningTime="2026-02-23 08:51:57.678205958 +0000 UTC m=+7639.929533112" Feb 23 08:51:57 crc kubenswrapper[5047]: I0223 08:51:57.721524 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" podStartSLOduration=3.721499021 podStartE2EDuration="3.721499021s" podCreationTimestamp="2026-02-23 08:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:57.705306846 +0000 UTC m=+7639.956633980" watchObservedRunningTime="2026-02-23 08:51:57.721499021 +0000 UTC m=+7639.972826155" Feb 23 08:51:58 crc kubenswrapper[5047]: I0223 08:51:58.205538 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 
08:51:58 crc kubenswrapper[5047]: I0223 08:51:58.253390 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.341562 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:52:00 crc kubenswrapper[5047]: E0223 08:52:00.344195 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.740704 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ec21f5-bf2e-4bf7-80ca-b14f21b44516","Type":"ContainerStarted","Data":"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5"} Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.740795 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ec21f5-bf2e-4bf7-80ca-b14f21b44516","Type":"ContainerStarted","Data":"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52"} Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.743693 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"023c3c52-cbe1-437b-ba55-f313bd545ba6","Type":"ContainerStarted","Data":"f806e832e2772a999811e25f467bf3106a01c38ff43fad44b5eaadb575c9eb60"} Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.744054 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="023c3c52-cbe1-437b-ba55-f313bd545ba6" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f806e832e2772a999811e25f467bf3106a01c38ff43fad44b5eaadb575c9eb60" gracePeriod=30 Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.746208 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18b33d4-e894-476b-afa1-33794c308ddf","Type":"ContainerStarted","Data":"139b653cba3d09cd04f3b19f2671958eacc366ff39f4f54aea92188e9e684e8f"} Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.753047 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18a35c0a-dcef-48db-8033-f48013cada37","Type":"ContainerStarted","Data":"f3c8e4e3bf75bb62695771891f6c276d8f1249ca52f9a6bfb88331641636b1b7"} Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.753100 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18a35c0a-dcef-48db-8033-f48013cada37","Type":"ContainerStarted","Data":"4dff392a9ca9a93f049266a7b04f65110ff990f8a781c81b3c9d43c30d7d9fdf"} Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.753254 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-log" containerID="cri-o://4dff392a9ca9a93f049266a7b04f65110ff990f8a781c81b3c9d43c30d7d9fdf" gracePeriod=30 Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.753614 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-metadata" containerID="cri-o://f3c8e4e3bf75bb62695771891f6c276d8f1249ca52f9a6bfb88331641636b1b7" gracePeriod=30 Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.759474 5047 generic.go:334] "Generic (PLEG): container finished" podID="dafcfc67-d904-4d7b-946f-1f04e880b98f" 
containerID="d5925a76b8dc1aa4699c52a60dfe9bafd55bce85e0b0fe07e7a09153ac94b57d" exitCode=0 Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.759541 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-997jz" event={"ID":"dafcfc67-d904-4d7b-946f-1f04e880b98f","Type":"ContainerDied","Data":"d5925a76b8dc1aa4699c52a60dfe9bafd55bce85e0b0fe07e7a09153ac94b57d"} Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.771020 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.586469363 podStartE2EDuration="7.7709898s" podCreationTimestamp="2026-02-23 08:51:53 +0000 UTC" firstStartedPulling="2026-02-23 08:51:55.863811194 +0000 UTC m=+7638.115138328" lastFinishedPulling="2026-02-23 08:52:00.048331631 +0000 UTC m=+7642.299658765" observedRunningTime="2026-02-23 08:52:00.760600082 +0000 UTC m=+7643.011927206" watchObservedRunningTime="2026-02-23 08:52:00.7709898 +0000 UTC m=+7643.022316934" Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.790758 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.965290391 podStartE2EDuration="7.79073638s" podCreationTimestamp="2026-02-23 08:51:53 +0000 UTC" firstStartedPulling="2026-02-23 08:51:56.22465549 +0000 UTC m=+7638.475982624" lastFinishedPulling="2026-02-23 08:52:00.050101479 +0000 UTC m=+7642.301428613" observedRunningTime="2026-02-23 08:52:00.788345006 +0000 UTC m=+7643.039672140" watchObservedRunningTime="2026-02-23 08:52:00.79073638 +0000 UTC m=+7643.042063514" Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.816645 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.056671045 podStartE2EDuration="6.816620856s" podCreationTimestamp="2026-02-23 08:51:54 +0000 UTC" firstStartedPulling="2026-02-23 08:51:56.289977464 +0000 UTC m=+7638.541304598" 
lastFinishedPulling="2026-02-23 08:52:00.049927275 +0000 UTC m=+7642.301254409" observedRunningTime="2026-02-23 08:52:00.810119841 +0000 UTC m=+7643.061446995" watchObservedRunningTime="2026-02-23 08:52:00.816620856 +0000 UTC m=+7643.067948010" Feb 23 08:52:00 crc kubenswrapper[5047]: I0223 08:52:00.854347 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.072225521 podStartE2EDuration="6.854321337s" podCreationTimestamp="2026-02-23 08:51:54 +0000 UTC" firstStartedPulling="2026-02-23 08:51:56.266979526 +0000 UTC m=+7638.518306660" lastFinishedPulling="2026-02-23 08:52:00.049075342 +0000 UTC m=+7642.300402476" observedRunningTime="2026-02-23 08:52:00.840618219 +0000 UTC m=+7643.091945373" watchObservedRunningTime="2026-02-23 08:52:00.854321337 +0000 UTC m=+7643.105648471" Feb 23 08:52:01 crc kubenswrapper[5047]: I0223 08:52:01.773504 5047 generic.go:334] "Generic (PLEG): container finished" podID="3dde3e53-3ac7-464f-a504-e1b815844cb4" containerID="4ada85ea2224cba0dd0ac5bfad26078501595310d0f4318c0f68ec1cab3e8d84" exitCode=0 Feb 23 08:52:01 crc kubenswrapper[5047]: I0223 08:52:01.773579 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-49rcc" event={"ID":"3dde3e53-3ac7-464f-a504-e1b815844cb4","Type":"ContainerDied","Data":"4ada85ea2224cba0dd0ac5bfad26078501595310d0f4318c0f68ec1cab3e8d84"} Feb 23 08:52:01 crc kubenswrapper[5047]: I0223 08:52:01.778569 5047 generic.go:334] "Generic (PLEG): container finished" podID="18a35c0a-dcef-48db-8033-f48013cada37" containerID="4dff392a9ca9a93f049266a7b04f65110ff990f8a781c81b3c9d43c30d7d9fdf" exitCode=143 Feb 23 08:52:01 crc kubenswrapper[5047]: I0223 08:52:01.778663 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18a35c0a-dcef-48db-8033-f48013cada37","Type":"ContainerDied","Data":"4dff392a9ca9a93f049266a7b04f65110ff990f8a781c81b3c9d43c30d7d9fdf"} Feb 23 
08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.254360 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.279698 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czptw\" (UniqueName: \"kubernetes.io/projected/dafcfc67-d904-4d7b-946f-1f04e880b98f-kube-api-access-czptw\") pod \"dafcfc67-d904-4d7b-946f-1f04e880b98f\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.279798 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-combined-ca-bundle\") pod \"dafcfc67-d904-4d7b-946f-1f04e880b98f\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.279854 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-scripts\") pod \"dafcfc67-d904-4d7b-946f-1f04e880b98f\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.280715 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-config-data\") pod \"dafcfc67-d904-4d7b-946f-1f04e880b98f\" (UID: \"dafcfc67-d904-4d7b-946f-1f04e880b98f\") " Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.286959 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-scripts" (OuterVolumeSpecName: "scripts") pod "dafcfc67-d904-4d7b-946f-1f04e880b98f" (UID: "dafcfc67-d904-4d7b-946f-1f04e880b98f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.294140 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafcfc67-d904-4d7b-946f-1f04e880b98f-kube-api-access-czptw" (OuterVolumeSpecName: "kube-api-access-czptw") pod "dafcfc67-d904-4d7b-946f-1f04e880b98f" (UID: "dafcfc67-d904-4d7b-946f-1f04e880b98f"). InnerVolumeSpecName "kube-api-access-czptw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.316545 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dafcfc67-d904-4d7b-946f-1f04e880b98f" (UID: "dafcfc67-d904-4d7b-946f-1f04e880b98f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.329370 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-config-data" (OuterVolumeSpecName: "config-data") pod "dafcfc67-d904-4d7b-946f-1f04e880b98f" (UID: "dafcfc67-d904-4d7b-946f-1f04e880b98f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.383737 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.383775 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.383791 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dafcfc67-d904-4d7b-946f-1f04e880b98f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.383803 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czptw\" (UniqueName: \"kubernetes.io/projected/dafcfc67-d904-4d7b-946f-1f04e880b98f-kube-api-access-czptw\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.795461 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-997jz" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.795459 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-997jz" event={"ID":"dafcfc67-d904-4d7b-946f-1f04e880b98f","Type":"ContainerDied","Data":"f5f4d50b6c3397e0186eb6bea1c1281983be601e9400aef13131b466bb223e04"} Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.795634 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f4d50b6c3397e0186eb6bea1c1281983be601e9400aef13131b466bb223e04" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.962167 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 08:52:02 crc kubenswrapper[5047]: E0223 08:52:02.962983 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafcfc67-d904-4d7b-946f-1f04e880b98f" containerName="nova-cell1-conductor-db-sync" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.963002 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafcfc67-d904-4d7b-946f-1f04e880b98f" containerName="nova-cell1-conductor-db-sync" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.963200 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafcfc67-d904-4d7b-946f-1f04e880b98f" containerName="nova-cell1-conductor-db-sync" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.963815 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.969613 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.984892 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.999351 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.999502 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:02 crc kubenswrapper[5047]: I0223 08:52:02.999547 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zpb\" (UniqueName: \"kubernetes.io/projected/01713a47-8bed-4038-8339-bdcd77e6e1db-kube-api-access-c8zpb\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.102880 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc 
kubenswrapper[5047]: I0223 08:52:03.102987 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.103038 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zpb\" (UniqueName: \"kubernetes.io/projected/01713a47-8bed-4038-8339-bdcd77e6e1db-kube-api-access-c8zpb\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.109507 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.112369 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.122647 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zpb\" (UniqueName: \"kubernetes.io/projected/01713a47-8bed-4038-8339-bdcd77e6e1db-kube-api-access-c8zpb\") pod \"nova-cell1-conductor-0\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.214436 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.296412 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.305769 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/3dde3e53-3ac7-464f-a504-e1b815844cb4-kube-api-access-xkplx\") pod \"3dde3e53-3ac7-464f-a504-e1b815844cb4\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.305862 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-combined-ca-bundle\") pod \"3dde3e53-3ac7-464f-a504-e1b815844cb4\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.306009 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-config-data\") pod \"3dde3e53-3ac7-464f-a504-e1b815844cb4\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.306088 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-scripts\") pod \"3dde3e53-3ac7-464f-a504-e1b815844cb4\" (UID: \"3dde3e53-3ac7-464f-a504-e1b815844cb4\") " Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.311490 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-scripts" (OuterVolumeSpecName: "scripts") pod "3dde3e53-3ac7-464f-a504-e1b815844cb4" (UID: "3dde3e53-3ac7-464f-a504-e1b815844cb4"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.313924 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dde3e53-3ac7-464f-a504-e1b815844cb4-kube-api-access-xkplx" (OuterVolumeSpecName: "kube-api-access-xkplx") pod "3dde3e53-3ac7-464f-a504-e1b815844cb4" (UID: "3dde3e53-3ac7-464f-a504-e1b815844cb4"). InnerVolumeSpecName "kube-api-access-xkplx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.346869 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dde3e53-3ac7-464f-a504-e1b815844cb4" (UID: "3dde3e53-3ac7-464f-a504-e1b815844cb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.352097 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-config-data" (OuterVolumeSpecName: "config-data") pod "3dde3e53-3ac7-464f-a504-e1b815844cb4" (UID: "3dde3e53-3ac7-464f-a504-e1b815844cb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.407611 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.407937 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.407951 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/3dde3e53-3ac7-464f-a504-e1b815844cb4-kube-api-access-xkplx\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.407963 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dde3e53-3ac7-464f-a504-e1b815844cb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.763349 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 08:52:03 crc kubenswrapper[5047]: W0223 08:52:03.780901 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01713a47_8bed_4038_8339_bdcd77e6e1db.slice/crio-36d003992f1ff6be3ceeaeb907294e65d0b8ade351ef14317b3419b70e24b148 WatchSource:0}: Error finding container 36d003992f1ff6be3ceeaeb907294e65d0b8ade351ef14317b3419b70e24b148: Status 404 returned error can't find the container with id 36d003992f1ff6be3ceeaeb907294e65d0b8ade351ef14317b3419b70e24b148 Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.808394 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"01713a47-8bed-4038-8339-bdcd77e6e1db","Type":"ContainerStarted","Data":"36d003992f1ff6be3ceeaeb907294e65d0b8ade351ef14317b3419b70e24b148"} Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.810282 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-49rcc" event={"ID":"3dde3e53-3ac7-464f-a504-e1b815844cb4","Type":"ContainerDied","Data":"a4715fe48675ee8aaa2b2db560444de0eaab0a02b96b2c644b436815f42cbad1"} Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.810319 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4715fe48675ee8aaa2b2db560444de0eaab0a02b96b2c644b436815f42cbad1" Feb 23 08:52:03 crc kubenswrapper[5047]: I0223 08:52:03.810400 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-49rcc" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.010974 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.011871 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerName="nova-api-log" containerID="cri-o://1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52" gracePeriod=30 Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.014229 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerName="nova-api-api" containerID="cri-o://eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5" gracePeriod=30 Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.068422 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.068852 5047 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="b18b33d4-e894-476b-afa1-33794c308ddf" containerName="nova-scheduler-scheduler" containerID="cri-o://139b653cba3d09cd04f3b19f2671958eacc366ff39f4f54aea92188e9e684e8f" gracePeriod=30 Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.486504 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.487122 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.542972 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.703845 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.747510 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.768781 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.775790 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vhm4\" (UniqueName: \"kubernetes.io/projected/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-kube-api-access-8vhm4\") pod \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.777406 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data\") pod \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " Feb 23 08:52:04 crc kubenswrapper[5047]: 
I0223 08:52:04.778235 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-logs\") pod \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.778352 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-combined-ca-bundle\") pod \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.778922 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-logs" (OuterVolumeSpecName: "logs") pod "95ec21f5-bf2e-4bf7-80ca-b14f21b44516" (UID: "95ec21f5-bf2e-4bf7-80ca-b14f21b44516"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.779678 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.797699 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-kube-api-access-8vhm4" (OuterVolumeSpecName: "kube-api-access-8vhm4") pod "95ec21f5-bf2e-4bf7-80ca-b14f21b44516" (UID: "95ec21f5-bf2e-4bf7-80ca-b14f21b44516"). InnerVolumeSpecName "kube-api-access-8vhm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:04 crc kubenswrapper[5047]: E0223 08:52:04.831039 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data podName:95ec21f5-bf2e-4bf7-80ca-b14f21b44516 nodeName:}" failed. No retries permitted until 2026-02-23 08:52:05.330990056 +0000 UTC m=+7647.582317190 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data") pod "95ec21f5-bf2e-4bf7-80ca-b14f21b44516" (UID: "95ec21f5-bf2e-4bf7-80ca-b14f21b44516") : error deleting /var/lib/kubelet/pods/95ec21f5-bf2e-4bf7-80ca-b14f21b44516/volume-subpaths: remove /var/lib/kubelet/pods/95ec21f5-bf2e-4bf7-80ca-b14f21b44516/volume-subpaths: no such file or directory Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.849157 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95ec21f5-bf2e-4bf7-80ca-b14f21b44516" (UID: "95ec21f5-bf2e-4bf7-80ca-b14f21b44516"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.849894 5047 generic.go:334] "Generic (PLEG): container finished" podID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerID="eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5" exitCode=0 Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.850128 5047 generic.go:334] "Generic (PLEG): container finished" podID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerID="1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52" exitCode=143 Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.850041 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ec21f5-bf2e-4bf7-80ca-b14f21b44516","Type":"ContainerDied","Data":"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5"} Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.850516 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ec21f5-bf2e-4bf7-80ca-b14f21b44516","Type":"ContainerDied","Data":"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52"} Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.850609 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95ec21f5-bf2e-4bf7-80ca-b14f21b44516","Type":"ContainerDied","Data":"faabefe8fc938777ab9bffe200b7f9e335b7586b9b5601f595fa20374b4e4f12"} Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.850736 5047 scope.go:117] "RemoveContainer" containerID="eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.850010 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.855651 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc7dfddf9-69wmk"] Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.856010 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" podUID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerName="dnsmasq-dns" containerID="cri-o://a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb" gracePeriod=10 Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.865076 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"01713a47-8bed-4038-8339-bdcd77e6e1db","Type":"ContainerStarted","Data":"9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431"} Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.866110 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.893728 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.893768 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vhm4\" (UniqueName: \"kubernetes.io/projected/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-kube-api-access-8vhm4\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.902186 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.902160997 podStartE2EDuration="2.902160997s" podCreationTimestamp="2026-02-23 08:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-23 08:52:04.892470287 +0000 UTC m=+7647.143797421" watchObservedRunningTime="2026-02-23 08:52:04.902160997 +0000 UTC m=+7647.153488131" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.902875 5047 scope.go:117] "RemoveContainer" containerID="1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.965012 5047 scope.go:117] "RemoveContainer" containerID="eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5" Feb 23 08:52:04 crc kubenswrapper[5047]: E0223 08:52:04.966052 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5\": container with ID starting with eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5 not found: ID does not exist" containerID="eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.966086 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5"} err="failed to get container status \"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5\": rpc error: code = NotFound desc = could not find container \"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5\": container with ID starting with eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5 not found: ID does not exist" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.966124 5047 scope.go:117] "RemoveContainer" containerID="1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52" Feb 23 08:52:04 crc kubenswrapper[5047]: E0223 08:52:04.966394 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52\": container with ID starting with 1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52 not found: ID does not exist" containerID="1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.966418 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52"} err="failed to get container status \"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52\": rpc error: code = NotFound desc = could not find container \"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52\": container with ID starting with 1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52 not found: ID does not exist" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.966437 5047 scope.go:117] "RemoveContainer" containerID="eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.966995 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5"} err="failed to get container status \"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5\": rpc error: code = NotFound desc = could not find container \"eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5\": container with ID starting with eaab2691f90d302a33f4fc7363ca36412f6f86a131ab5ca742c4c46ae1674ab5 not found: ID does not exist" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.967037 5047 scope.go:117] "RemoveContainer" containerID="1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52" Feb 23 08:52:04 crc kubenswrapper[5047]: I0223 08:52:04.967398 5047 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52"} err="failed to get container status \"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52\": rpc error: code = NotFound desc = could not find container \"1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52\": container with ID starting with 1e42dc083e29a4306819296e21137817a733c640db41bf48fb94b7b2d70c0c52 not found: ID does not exist" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.368414 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.402216 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-config\") pod \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.402368 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bjwt\" (UniqueName: \"kubernetes.io/projected/8d8399d9-f20e-4917-acd7-d1f94eb6269d-kube-api-access-8bjwt\") pod \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.402448 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-sb\") pod \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.402536 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-nb\") pod 
\"8d8399d9-f20e-4917-acd7-d1f94eb6269d\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.402594 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-dns-svc\") pod \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\" (UID: \"8d8399d9-f20e-4917-acd7-d1f94eb6269d\") " Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.402743 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data\") pod \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\" (UID: \"95ec21f5-bf2e-4bf7-80ca-b14f21b44516\") " Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.446330 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8399d9-f20e-4917-acd7-d1f94eb6269d-kube-api-access-8bjwt" (OuterVolumeSpecName: "kube-api-access-8bjwt") pod "8d8399d9-f20e-4917-acd7-d1f94eb6269d" (UID: "8d8399d9-f20e-4917-acd7-d1f94eb6269d"). InnerVolumeSpecName "kube-api-access-8bjwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.446427 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data" (OuterVolumeSpecName: "config-data") pod "95ec21f5-bf2e-4bf7-80ca-b14f21b44516" (UID: "95ec21f5-bf2e-4bf7-80ca-b14f21b44516"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.475660 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d8399d9-f20e-4917-acd7-d1f94eb6269d" (UID: "8d8399d9-f20e-4917-acd7-d1f94eb6269d"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.478376 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d8399d9-f20e-4917-acd7-d1f94eb6269d" (UID: "8d8399d9-f20e-4917-acd7-d1f94eb6269d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.488024 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d8399d9-f20e-4917-acd7-d1f94eb6269d" (UID: "8d8399d9-f20e-4917-acd7-d1f94eb6269d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.501276 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-config" (OuterVolumeSpecName: "config") pod "8d8399d9-f20e-4917-acd7-d1f94eb6269d" (UID: "8d8399d9-f20e-4917-acd7-d1f94eb6269d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.507020 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bjwt\" (UniqueName: \"kubernetes.io/projected/8d8399d9-f20e-4917-acd7-d1f94eb6269d-kube-api-access-8bjwt\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.507065 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.507077 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.507089 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.507101 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec21f5-bf2e-4bf7-80ca-b14f21b44516-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.507114 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d8399d9-f20e-4917-acd7-d1f94eb6269d-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.533063 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.541207 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.550558 5047 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 08:52:05 crc kubenswrapper[5047]: E0223 08:52:05.551073 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerName="init" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551094 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerName="init" Feb 23 08:52:05 crc kubenswrapper[5047]: E0223 08:52:05.551131 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerName="dnsmasq-dns" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551138 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerName="dnsmasq-dns" Feb 23 08:52:05 crc kubenswrapper[5047]: E0223 08:52:05.551156 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dde3e53-3ac7-464f-a504-e1b815844cb4" containerName="nova-manage" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551162 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dde3e53-3ac7-464f-a504-e1b815844cb4" containerName="nova-manage" Feb 23 08:52:05 crc kubenswrapper[5047]: E0223 08:52:05.551176 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerName="nova-api-log" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551183 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerName="nova-api-log" Feb 23 08:52:05 crc kubenswrapper[5047]: E0223 08:52:05.551194 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerName="nova-api-api" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551200 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" 
containerName="nova-api-api" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551379 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dde3e53-3ac7-464f-a504-e1b815844cb4" containerName="nova-manage" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551398 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerName="nova-api-api" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551412 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" containerName="nova-api-log" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.551423 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerName="dnsmasq-dns" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.552554 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.559641 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.585643 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.609082 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qn9s\" (UniqueName: \"kubernetes.io/projected/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-kube-api-access-6qn9s\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.609206 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-logs\") pod \"nova-api-0\" (UID: 
\"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.609235 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-config-data\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.609264 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.711795 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-logs\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.712216 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-config-data\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.712395 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.712328 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-logs\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.712634 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qn9s\" (UniqueName: \"kubernetes.io/projected/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-kube-api-access-6qn9s\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.716062 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.717591 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-config-data\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.736886 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qn9s\" (UniqueName: \"kubernetes.io/projected/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-kube-api-access-6qn9s\") pod \"nova-api-0\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.877566 5047 generic.go:334] "Generic (PLEG): container finished" podID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" containerID="a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb" exitCode=0 Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.877673 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.878038 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" event={"ID":"8d8399d9-f20e-4917-acd7-d1f94eb6269d","Type":"ContainerDied","Data":"a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb"} Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.878134 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc7dfddf9-69wmk" event={"ID":"8d8399d9-f20e-4917-acd7-d1f94eb6269d","Type":"ContainerDied","Data":"c6533418e3e55fb00d28556990303d44941393dc57fa9d4402b3c55626891a8e"} Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.878174 5047 scope.go:117] "RemoveContainer" containerID="a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.879363 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.927888 5047 scope.go:117] "RemoveContainer" containerID="31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.933037 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc7dfddf9-69wmk"] Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.947325 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bc7dfddf9-69wmk"] Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.973026 5047 scope.go:117] "RemoveContainer" containerID="a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb" Feb 23 08:52:05 crc kubenswrapper[5047]: E0223 08:52:05.973604 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb\": container with 
ID starting with a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb not found: ID does not exist" containerID="a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.973675 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb"} err="failed to get container status \"a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb\": rpc error: code = NotFound desc = could not find container \"a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb\": container with ID starting with a1e39e930216b49fad367ef294a370baecdcacf8c84a006b1d85522565452fbb not found: ID does not exist" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.973743 5047 scope.go:117] "RemoveContainer" containerID="31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768" Feb 23 08:52:05 crc kubenswrapper[5047]: E0223 08:52:05.974386 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768\": container with ID starting with 31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768 not found: ID does not exist" containerID="31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768" Feb 23 08:52:05 crc kubenswrapper[5047]: I0223 08:52:05.974414 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768"} err="failed to get container status \"31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768\": rpc error: code = NotFound desc = could not find container \"31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768\": container with ID starting with 31f16e59216955638fe22ea9ca7717ca02e56ba40b17fa772cdca5f175acc768 not 
found: ID does not exist" Feb 23 08:52:06 crc kubenswrapper[5047]: I0223 08:52:06.352274 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8399d9-f20e-4917-acd7-d1f94eb6269d" path="/var/lib/kubelet/pods/8d8399d9-f20e-4917-acd7-d1f94eb6269d/volumes" Feb 23 08:52:06 crc kubenswrapper[5047]: I0223 08:52:06.353956 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ec21f5-bf2e-4bf7-80ca-b14f21b44516" path="/var/lib/kubelet/pods/95ec21f5-bf2e-4bf7-80ca-b14f21b44516/volumes" Feb 23 08:52:06 crc kubenswrapper[5047]: W0223 08:52:06.465147 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37115b42_ccfb_4aad_8df3_9e15ae5d2dcf.slice/crio-66228b74688ac16182c4b891cb9b1ea87902277de99a4c5391fd3c32d61b30f0 WatchSource:0}: Error finding container 66228b74688ac16182c4b891cb9b1ea87902277de99a4c5391fd3c32d61b30f0: Status 404 returned error can't find the container with id 66228b74688ac16182c4b891cb9b1ea87902277de99a4c5391fd3c32d61b30f0 Feb 23 08:52:06 crc kubenswrapper[5047]: I0223 08:52:06.466133 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:52:06 crc kubenswrapper[5047]: I0223 08:52:06.890954 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf","Type":"ContainerStarted","Data":"59dfcbeeffabc99d039d88010bb5dbb9ae513cd15cc4d2873ae9b5dbbbe1aea4"} Feb 23 08:52:06 crc kubenswrapper[5047]: I0223 08:52:06.891415 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf","Type":"ContainerStarted","Data":"66228b74688ac16182c4b891cb9b1ea87902277de99a4c5391fd3c32d61b30f0"} Feb 23 08:52:07 crc kubenswrapper[5047]: I0223 08:52:07.049530 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-sbvrk"] Feb 23 08:52:07 crc 
kubenswrapper[5047]: I0223 08:52:07.058295 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ce6c-account-create-update-hlzml"] Feb 23 08:52:07 crc kubenswrapper[5047]: I0223 08:52:07.069229 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ce6c-account-create-update-hlzml"] Feb 23 08:52:07 crc kubenswrapper[5047]: I0223 08:52:07.078654 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-sbvrk"] Feb 23 08:52:07 crc kubenswrapper[5047]: I0223 08:52:07.904790 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf","Type":"ContainerStarted","Data":"e91f6fc2a51b6dd890ddd0cd906848298a6cfb611dc8571a601ec0e8de7c70bd"} Feb 23 08:52:07 crc kubenswrapper[5047]: I0223 08:52:07.938167 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.938138394 podStartE2EDuration="2.938138394s" podCreationTimestamp="2026-02-23 08:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:52:07.928051963 +0000 UTC m=+7650.179379287" watchObservedRunningTime="2026-02-23 08:52:07.938138394 +0000 UTC m=+7650.189465528" Feb 23 08:52:08 crc kubenswrapper[5047]: I0223 08:52:08.362697 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9" path="/var/lib/kubelet/pods/8d9c6bb4-ecce-45e6-abe5-a9312dca8ff9/volumes" Feb 23 08:52:08 crc kubenswrapper[5047]: I0223 08:52:08.363601 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9181e45e-92b4-4708-b4f6-6a6d2788f58c" path="/var/lib/kubelet/pods/9181e45e-92b4-4708-b4f6-6a6d2788f58c/volumes" Feb 23 08:52:13 crc kubenswrapper[5047]: I0223 08:52:13.340768 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-conductor-0" Feb 23 08:52:13 crc kubenswrapper[5047]: I0223 08:52:13.341646 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:52:13 crc kubenswrapper[5047]: E0223 08:52:13.342033 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:52:15 crc kubenswrapper[5047]: I0223 08:52:15.880093 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:52:15 crc kubenswrapper[5047]: I0223 08:52:15.880568 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:52:16 crc kubenswrapper[5047]: I0223 08:52:16.963105 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:16 crc kubenswrapper[5047]: I0223 08:52:16.963211 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:20 crc kubenswrapper[5047]: I0223 08:52:20.056325 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xstzb"] Feb 23 08:52:20 crc kubenswrapper[5047]: I0223 08:52:20.070504 5047 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xstzb"] Feb 23 08:52:20 crc kubenswrapper[5047]: I0223 08:52:20.363027 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423c715b-9e17-481e-8cf4-2ff875c1c45b" path="/var/lib/kubelet/pods/423c715b-9e17-481e-8cf4-2ff875c1c45b/volumes" Feb 23 08:52:26 crc kubenswrapper[5047]: I0223 08:52:26.921212 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:26 crc kubenswrapper[5047]: I0223 08:52:26.962286 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:27 crc kubenswrapper[5047]: I0223 08:52:27.341860 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:52:27 crc kubenswrapper[5047]: E0223 08:52:27.342380 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.190893 5047 generic.go:334] "Generic (PLEG): container finished" podID="18a35c0a-dcef-48db-8033-f48013cada37" containerID="f3c8e4e3bf75bb62695771891f6c276d8f1249ca52f9a6bfb88331641636b1b7" exitCode=137 Feb 23 
08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.190985 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18a35c0a-dcef-48db-8033-f48013cada37","Type":"ContainerDied","Data":"f3c8e4e3bf75bb62695771891f6c276d8f1249ca52f9a6bfb88331641636b1b7"} Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.194539 5047 generic.go:334] "Generic (PLEG): container finished" podID="023c3c52-cbe1-437b-ba55-f313bd545ba6" containerID="f806e832e2772a999811e25f467bf3106a01c38ff43fad44b5eaadb575c9eb60" exitCode=137 Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.194598 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"023c3c52-cbe1-437b-ba55-f313bd545ba6","Type":"ContainerDied","Data":"f806e832e2772a999811e25f467bf3106a01c38ff43fad44b5eaadb575c9eb60"} Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.194629 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"023c3c52-cbe1-437b-ba55-f313bd545ba6","Type":"ContainerDied","Data":"eb25522ce8432df344a7f8e586d94d6be6a8db16165084ee1ff550447d9300f0"} Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.194651 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb25522ce8432df344a7f8e586d94d6be6a8db16165084ee1ff550447d9300f0" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.273606 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.280225 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.454032 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a35c0a-dcef-48db-8033-f48013cada37-logs\") pod \"18a35c0a-dcef-48db-8033-f48013cada37\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.454115 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-config-data\") pod \"18a35c0a-dcef-48db-8033-f48013cada37\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.454183 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rmtm\" (UniqueName: \"kubernetes.io/projected/023c3c52-cbe1-437b-ba55-f313bd545ba6-kube-api-access-5rmtm\") pod \"023c3c52-cbe1-437b-ba55-f313bd545ba6\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.454857 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18a35c0a-dcef-48db-8033-f48013cada37-logs" (OuterVolumeSpecName: "logs") pod "18a35c0a-dcef-48db-8033-f48013cada37" (UID: "18a35c0a-dcef-48db-8033-f48013cada37"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.455089 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-combined-ca-bundle\") pod \"023c3c52-cbe1-437b-ba55-f313bd545ba6\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.455165 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn9qs\" (UniqueName: \"kubernetes.io/projected/18a35c0a-dcef-48db-8033-f48013cada37-kube-api-access-hn9qs\") pod \"18a35c0a-dcef-48db-8033-f48013cada37\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.455220 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-combined-ca-bundle\") pod \"18a35c0a-dcef-48db-8033-f48013cada37\" (UID: \"18a35c0a-dcef-48db-8033-f48013cada37\") " Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.455249 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-config-data\") pod \"023c3c52-cbe1-437b-ba55-f313bd545ba6\" (UID: \"023c3c52-cbe1-437b-ba55-f313bd545ba6\") " Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.455735 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a35c0a-dcef-48db-8033-f48013cada37-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.461363 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023c3c52-cbe1-437b-ba55-f313bd545ba6-kube-api-access-5rmtm" (OuterVolumeSpecName: 
"kube-api-access-5rmtm") pod "023c3c52-cbe1-437b-ba55-f313bd545ba6" (UID: "023c3c52-cbe1-437b-ba55-f313bd545ba6"). InnerVolumeSpecName "kube-api-access-5rmtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.462374 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a35c0a-dcef-48db-8033-f48013cada37-kube-api-access-hn9qs" (OuterVolumeSpecName: "kube-api-access-hn9qs") pod "18a35c0a-dcef-48db-8033-f48013cada37" (UID: "18a35c0a-dcef-48db-8033-f48013cada37"). InnerVolumeSpecName "kube-api-access-hn9qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.484076 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-config-data" (OuterVolumeSpecName: "config-data") pod "18a35c0a-dcef-48db-8033-f48013cada37" (UID: "18a35c0a-dcef-48db-8033-f48013cada37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.484649 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-config-data" (OuterVolumeSpecName: "config-data") pod "023c3c52-cbe1-437b-ba55-f313bd545ba6" (UID: "023c3c52-cbe1-437b-ba55-f313bd545ba6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.487350 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "023c3c52-cbe1-437b-ba55-f313bd545ba6" (UID: "023c3c52-cbe1-437b-ba55-f313bd545ba6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.499785 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18a35c0a-dcef-48db-8033-f48013cada37" (UID: "18a35c0a-dcef-48db-8033-f48013cada37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.557710 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rmtm\" (UniqueName: \"kubernetes.io/projected/023c3c52-cbe1-437b-ba55-f313bd545ba6-kube-api-access-5rmtm\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.557749 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.557766 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn9qs\" (UniqueName: \"kubernetes.io/projected/18a35c0a-dcef-48db-8033-f48013cada37-kube-api-access-hn9qs\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.557779 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.557793 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023c3c52-cbe1-437b-ba55-f313bd545ba6-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:31 crc kubenswrapper[5047]: I0223 08:52:31.557804 5047 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/18a35c0a-dcef-48db-8033-f48013cada37-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.208703 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.208703 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.208733 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18a35c0a-dcef-48db-8033-f48013cada37","Type":"ContainerDied","Data":"149664fcdd98c5506f46884247a52151bceb3ad168ec2502fcf182021b5b9424"} Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.210495 5047 scope.go:117] "RemoveContainer" containerID="f3c8e4e3bf75bb62695771891f6c276d8f1249ca52f9a6bfb88331641636b1b7" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.250179 5047 scope.go:117] "RemoveContainer" containerID="4dff392a9ca9a93f049266a7b04f65110ff990f8a781c81b3c9d43c30d7d9fdf" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.269482 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.294360 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.318605 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.332303 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.410350 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="023c3c52-cbe1-437b-ba55-f313bd545ba6" 
path="/var/lib/kubelet/pods/023c3c52-cbe1-437b-ba55-f313bd545ba6/volumes" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.411043 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a35c0a-dcef-48db-8033-f48013cada37" path="/var/lib/kubelet/pods/18a35c0a-dcef-48db-8033-f48013cada37/volumes" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.411748 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: E0223 08:52:32.412096 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-log" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.412117 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-log" Feb 23 08:52:32 crc kubenswrapper[5047]: E0223 08:52:32.412143 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-metadata" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.412152 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-metadata" Feb 23 08:52:32 crc kubenswrapper[5047]: E0223 08:52:32.412179 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023c3c52-cbe1-437b-ba55-f313bd545ba6" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.412187 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="023c3c52-cbe1-437b-ba55-f313bd545ba6" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.412409 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="023c3c52-cbe1-437b-ba55-f313bd545ba6" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 
08:52:32.412445 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-log" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.412465 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a35c0a-dcef-48db-8033-f48013cada37" containerName="nova-metadata-metadata" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.413851 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.414077 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.415866 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.416009 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.416935 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.417163 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.420379 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.420594 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.420738 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.423421 5047 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582221 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099c24fd-6452-4bfb-99bb-08b1304412b3-logs\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582423 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj9xl\" (UniqueName: \"kubernetes.io/projected/054e8a69-ea75-47c8-bd54-b4475341100f-kube-api-access-tj9xl\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582475 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pvnd\" (UniqueName: \"kubernetes.io/projected/099c24fd-6452-4bfb-99bb-08b1304412b3-kube-api-access-6pvnd\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582551 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582624 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-config-data\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" 
Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582682 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582771 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582814 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.582960 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.583044 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " 
pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686289 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686387 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686437 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099c24fd-6452-4bfb-99bb-08b1304412b3-logs\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686489 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pvnd\" (UniqueName: \"kubernetes.io/projected/099c24fd-6452-4bfb-99bb-08b1304412b3-kube-api-access-6pvnd\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686511 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj9xl\" (UniqueName: \"kubernetes.io/projected/054e8a69-ea75-47c8-bd54-b4475341100f-kube-api-access-tj9xl\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686552 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686593 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-config-data\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686640 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686701 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.686739 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.694022 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.695680 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.699057 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.699583 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.700305 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099c24fd-6452-4bfb-99bb-08b1304412b3-logs\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.701503 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-config-data\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " 
pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.702795 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.708615 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.710959 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj9xl\" (UniqueName: \"kubernetes.io/projected/054e8a69-ea75-47c8-bd54-b4475341100f-kube-api-access-tj9xl\") pod \"nova-cell1-novncproxy-0\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.718550 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pvnd\" (UniqueName: \"kubernetes.io/projected/099c24fd-6452-4bfb-99bb-08b1304412b3-kube-api-access-6pvnd\") pod \"nova-metadata-0\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " pod="openstack/nova-metadata-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.742899 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:32 crc kubenswrapper[5047]: I0223 08:52:32.752696 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:52:33 crc kubenswrapper[5047]: I0223 08:52:33.302752 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:52:33 crc kubenswrapper[5047]: W0223 08:52:33.305821 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod054e8a69_ea75_47c8_bd54_b4475341100f.slice/crio-b1f821f0c0aa971eb2f41a23bef8446760fe40b1f778e18fd5ac0eb656946e38 WatchSource:0}: Error finding container b1f821f0c0aa971eb2f41a23bef8446760fe40b1f778e18fd5ac0eb656946e38: Status 404 returned error can't find the container with id b1f821f0c0aa971eb2f41a23bef8446760fe40b1f778e18fd5ac0eb656946e38 Feb 23 08:52:33 crc kubenswrapper[5047]: I0223 08:52:33.340478 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:52:33 crc kubenswrapper[5047]: W0223 08:52:33.345075 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod099c24fd_6452_4bfb_99bb_08b1304412b3.slice/crio-5d75e09e6b11c9acd3c70741bf8d77988f67438fed8cc0ebfaf49ba170d5ad78 WatchSource:0}: Error finding container 5d75e09e6b11c9acd3c70741bf8d77988f67438fed8cc0ebfaf49ba170d5ad78: Status 404 returned error can't find the container with id 5d75e09e6b11c9acd3c70741bf8d77988f67438fed8cc0ebfaf49ba170d5ad78 Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.282790 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"099c24fd-6452-4bfb-99bb-08b1304412b3","Type":"ContainerStarted","Data":"fe25c9c5d07dc38acc41cd64e8eebf2bad34c1053763c03e932f4786a28248cb"} Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.283742 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"099c24fd-6452-4bfb-99bb-08b1304412b3","Type":"ContainerStarted","Data":"990d530341f70397379fe224e343c016e6e159f6712f07a61694c646dab09e15"} Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.283765 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"099c24fd-6452-4bfb-99bb-08b1304412b3","Type":"ContainerStarted","Data":"5d75e09e6b11c9acd3c70741bf8d77988f67438fed8cc0ebfaf49ba170d5ad78"} Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.290115 5047 generic.go:334] "Generic (PLEG): container finished" podID="b18b33d4-e894-476b-afa1-33794c308ddf" containerID="139b653cba3d09cd04f3b19f2671958eacc366ff39f4f54aea92188e9e684e8f" exitCode=137 Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.290197 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18b33d4-e894-476b-afa1-33794c308ddf","Type":"ContainerDied","Data":"139b653cba3d09cd04f3b19f2671958eacc366ff39f4f54aea92188e9e684e8f"} Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.293401 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"054e8a69-ea75-47c8-bd54-b4475341100f","Type":"ContainerStarted","Data":"06eae257f344b94484be127cf7a2511935e34c00c735dcc879f6b8f3cea4311b"} Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.293474 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"054e8a69-ea75-47c8-bd54-b4475341100f","Type":"ContainerStarted","Data":"b1f821f0c0aa971eb2f41a23bef8446760fe40b1f778e18fd5ac0eb656946e38"} Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.317306 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.317276768 podStartE2EDuration="2.317276768s" podCreationTimestamp="2026-02-23 08:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:52:34.301053042 +0000 UTC m=+7676.552380176" watchObservedRunningTime="2026-02-23 08:52:34.317276768 +0000 UTC m=+7676.568603902" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.328602 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.328578551 podStartE2EDuration="2.328578551s" podCreationTimestamp="2026-02-23 08:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:52:34.320068983 +0000 UTC m=+7676.571396117" watchObservedRunningTime="2026-02-23 08:52:34.328578551 +0000 UTC m=+7676.579905685" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.579328 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.632561 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-combined-ca-bundle\") pod \"b18b33d4-e894-476b-afa1-33794c308ddf\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.632734 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxbrb\" (UniqueName: \"kubernetes.io/projected/b18b33d4-e894-476b-afa1-33794c308ddf-kube-api-access-vxbrb\") pod \"b18b33d4-e894-476b-afa1-33794c308ddf\" (UID: \"b18b33d4-e894-476b-afa1-33794c308ddf\") " Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.632825 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-config-data\") pod \"b18b33d4-e894-476b-afa1-33794c308ddf\" (UID: 
\"b18b33d4-e894-476b-afa1-33794c308ddf\") " Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.639229 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18b33d4-e894-476b-afa1-33794c308ddf-kube-api-access-vxbrb" (OuterVolumeSpecName: "kube-api-access-vxbrb") pod "b18b33d4-e894-476b-afa1-33794c308ddf" (UID: "b18b33d4-e894-476b-afa1-33794c308ddf"). InnerVolumeSpecName "kube-api-access-vxbrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.663532 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b18b33d4-e894-476b-afa1-33794c308ddf" (UID: "b18b33d4-e894-476b-afa1-33794c308ddf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.675325 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-config-data" (OuterVolumeSpecName: "config-data") pod "b18b33d4-e894-476b-afa1-33794c308ddf" (UID: "b18b33d4-e894-476b-afa1-33794c308ddf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.734516 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.734557 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxbrb\" (UniqueName: \"kubernetes.io/projected/b18b33d4-e894-476b-afa1-33794c308ddf-kube-api-access-vxbrb\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:34 crc kubenswrapper[5047]: I0223 08:52:34.734569 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18b33d4-e894-476b-afa1-33794c308ddf-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.065662 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2tfvh"] Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.080241 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2tfvh"] Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.318553 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18b33d4-e894-476b-afa1-33794c308ddf","Type":"ContainerDied","Data":"ff6c950539a0920dfd1f378921c83e117f90cbf9abca078fea04e5e859bfcc57"} Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.319364 5047 scope.go:117] "RemoveContainer" containerID="139b653cba3d09cd04f3b19f2671958eacc366ff39f4f54aea92188e9e684e8f" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.320832 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.388549 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.414884 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.429462 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:52:35 crc kubenswrapper[5047]: E0223 08:52:35.430267 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18b33d4-e894-476b-afa1-33794c308ddf" containerName="nova-scheduler-scheduler" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.430354 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18b33d4-e894-476b-afa1-33794c308ddf" containerName="nova-scheduler-scheduler" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.430626 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18b33d4-e894-476b-afa1-33794c308ddf" containerName="nova-scheduler-scheduler" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.431564 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.436255 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.444798 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.555042 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw2tg\" (UniqueName: \"kubernetes.io/projected/46dcfef0-cc05-4de1-977f-622b93900d64-kube-api-access-mw2tg\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.555128 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.555742 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-config-data\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.657599 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw2tg\" (UniqueName: \"kubernetes.io/projected/46dcfef0-cc05-4de1-977f-622b93900d64-kube-api-access-mw2tg\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.657972 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.658156 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-config-data\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.665296 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.665314 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-config-data\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.688304 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw2tg\" (UniqueName: \"kubernetes.io/projected/46dcfef0-cc05-4de1-977f-622b93900d64-kube-api-access-mw2tg\") pod \"nova-scheduler-0\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.764707 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.879780 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:52:35 crc kubenswrapper[5047]: I0223 08:52:35.880278 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:52:36 crc kubenswrapper[5047]: W0223 08:52:36.069079 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dcfef0_cc05_4de1_977f_622b93900d64.slice/crio-0af00cb3447a308c778d79461b822a0683c4ab285e68c6cb08115b2e0faf314e WatchSource:0}: Error finding container 0af00cb3447a308c778d79461b822a0683c4ab285e68c6cb08115b2e0faf314e: Status 404 returned error can't find the container with id 0af00cb3447a308c778d79461b822a0683c4ab285e68c6cb08115b2e0faf314e Feb 23 08:52:36 crc kubenswrapper[5047]: I0223 08:52:36.074053 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:52:36 crc kubenswrapper[5047]: I0223 08:52:36.335170 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46dcfef0-cc05-4de1-977f-622b93900d64","Type":"ContainerStarted","Data":"0af00cb3447a308c778d79461b822a0683c4ab285e68c6cb08115b2e0faf314e"} Feb 23 08:52:36 crc kubenswrapper[5047]: I0223 08:52:36.362357 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab4fce8-e5e0-42e1-b0ec-b185ab028913" path="/var/lib/kubelet/pods/8ab4fce8-e5e0-42e1-b0ec-b185ab028913/volumes" Feb 23 08:52:36 crc kubenswrapper[5047]: I0223 08:52:36.363163 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18b33d4-e894-476b-afa1-33794c308ddf" path="/var/lib/kubelet/pods/b18b33d4-e894-476b-afa1-33794c308ddf/volumes" Feb 23 08:52:36 crc kubenswrapper[5047]: I0223 08:52:36.879606 5047 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:36 crc kubenswrapper[5047]: I0223 08:52:36.921177 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:37 crc kubenswrapper[5047]: I0223 08:52:37.353640 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46dcfef0-cc05-4de1-977f-622b93900d64","Type":"ContainerStarted","Data":"4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d"} Feb 23 08:52:37 crc kubenswrapper[5047]: I0223 08:52:37.390807 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.390775043 podStartE2EDuration="2.390775043s" podCreationTimestamp="2026-02-23 08:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:52:37.378130204 +0000 UTC m=+7679.629457428" watchObservedRunningTime="2026-02-23 08:52:37.390775043 +0000 UTC m=+7679.642102207" Feb 23 08:52:37 crc kubenswrapper[5047]: I0223 08:52:37.744084 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:37 crc kubenswrapper[5047]: I0223 08:52:37.753195 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:52:37 crc kubenswrapper[5047]: I0223 08:52:37.753242 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:52:40 crc 
kubenswrapper[5047]: I0223 08:52:40.765304 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 08:52:42 crc kubenswrapper[5047]: I0223 08:52:42.342952 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:52:42 crc kubenswrapper[5047]: E0223 08:52:42.344045 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:52:42 crc kubenswrapper[5047]: I0223 08:52:42.743222 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:42 crc kubenswrapper[5047]: I0223 08:52:42.753153 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:52:42 crc kubenswrapper[5047]: I0223 08:52:42.753257 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:52:42 crc kubenswrapper[5047]: I0223 08:52:42.782563 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.475504 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.726594 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mlbcp"] Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.728100 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.735118 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.735161 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.749589 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mlbcp"] Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.774205 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.102:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.774274 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.102:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.864102 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrnw\" (UniqueName: \"kubernetes.io/projected/4323bc60-603e-45a2-81fb-7a8fa07d07c7-kube-api-access-6lrnw\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.864193 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-scripts\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.864232 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.864259 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-config-data\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.966050 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrnw\" (UniqueName: \"kubernetes.io/projected/4323bc60-603e-45a2-81fb-7a8fa07d07c7-kube-api-access-6lrnw\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.966145 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-scripts\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.966176 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.966195 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-config-data\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.973493 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-config-data\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.974635 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-scripts\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.976509 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:43 crc kubenswrapper[5047]: I0223 08:52:43.986973 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrnw\" (UniqueName: \"kubernetes.io/projected/4323bc60-603e-45a2-81fb-7a8fa07d07c7-kube-api-access-6lrnw\") pod 
\"nova-cell1-cell-mapping-mlbcp\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:44 crc kubenswrapper[5047]: I0223 08:52:44.051854 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:44 crc kubenswrapper[5047]: I0223 08:52:44.539621 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mlbcp"] Feb 23 08:52:45 crc kubenswrapper[5047]: I0223 08:52:45.474223 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mlbcp" event={"ID":"4323bc60-603e-45a2-81fb-7a8fa07d07c7","Type":"ContainerStarted","Data":"7eaea98d0755487d3bd7e40d9df640eabeb421e14f34fb1d51127b4360c754c2"} Feb 23 08:52:45 crc kubenswrapper[5047]: I0223 08:52:45.474986 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mlbcp" event={"ID":"4323bc60-603e-45a2-81fb-7a8fa07d07c7","Type":"ContainerStarted","Data":"f4bb1af880eb4c6ffb76155c4aaa49d250ead38f7703e0d4b61904354d3c1d53"} Feb 23 08:52:45 crc kubenswrapper[5047]: I0223 08:52:45.506966 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mlbcp" podStartSLOduration=2.506945213 podStartE2EDuration="2.506945213s" podCreationTimestamp="2026-02-23 08:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:52:45.495219307 +0000 UTC m=+7687.746546461" watchObservedRunningTime="2026-02-23 08:52:45.506945213 +0000 UTC m=+7687.758272347" Feb 23 08:52:45 crc kubenswrapper[5047]: I0223 08:52:45.766945 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 08:52:45 crc kubenswrapper[5047]: I0223 08:52:45.797872 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Feb 23 08:52:46 crc kubenswrapper[5047]: I0223 08:52:46.519134 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 08:52:46 crc kubenswrapper[5047]: I0223 08:52:46.964112 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:46 crc kubenswrapper[5047]: I0223 08:52:46.964201 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:52:50 crc kubenswrapper[5047]: I0223 08:52:50.545806 5047 generic.go:334] "Generic (PLEG): container finished" podID="4323bc60-603e-45a2-81fb-7a8fa07d07c7" containerID="7eaea98d0755487d3bd7e40d9df640eabeb421e14f34fb1d51127b4360c754c2" exitCode=0 Feb 23 08:52:50 crc kubenswrapper[5047]: I0223 08:52:50.545979 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mlbcp" event={"ID":"4323bc60-603e-45a2-81fb-7a8fa07d07c7","Type":"ContainerDied","Data":"7eaea98d0755487d3bd7e40d9df640eabeb421e14f34fb1d51127b4360c754c2"} Feb 23 08:52:51 crc kubenswrapper[5047]: I0223 08:52:51.952404 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.072501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-combined-ca-bundle\") pod \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.072686 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-scripts\") pod \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.072856 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lrnw\" (UniqueName: \"kubernetes.io/projected/4323bc60-603e-45a2-81fb-7a8fa07d07c7-kube-api-access-6lrnw\") pod \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.072949 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-config-data\") pod \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\" (UID: \"4323bc60-603e-45a2-81fb-7a8fa07d07c7\") " Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.081006 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4323bc60-603e-45a2-81fb-7a8fa07d07c7-kube-api-access-6lrnw" (OuterVolumeSpecName: "kube-api-access-6lrnw") pod "4323bc60-603e-45a2-81fb-7a8fa07d07c7" (UID: "4323bc60-603e-45a2-81fb-7a8fa07d07c7"). InnerVolumeSpecName "kube-api-access-6lrnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.081666 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-scripts" (OuterVolumeSpecName: "scripts") pod "4323bc60-603e-45a2-81fb-7a8fa07d07c7" (UID: "4323bc60-603e-45a2-81fb-7a8fa07d07c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.107004 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-config-data" (OuterVolumeSpecName: "config-data") pod "4323bc60-603e-45a2-81fb-7a8fa07d07c7" (UID: "4323bc60-603e-45a2-81fb-7a8fa07d07c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.107479 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4323bc60-603e-45a2-81fb-7a8fa07d07c7" (UID: "4323bc60-603e-45a2-81fb-7a8fa07d07c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.176770 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lrnw\" (UniqueName: \"kubernetes.io/projected/4323bc60-603e-45a2-81fb-7a8fa07d07c7-kube-api-access-6lrnw\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.177105 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.177233 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.177327 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4323bc60-603e-45a2-81fb-7a8fa07d07c7-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.575299 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mlbcp" event={"ID":"4323bc60-603e-45a2-81fb-7a8fa07d07c7","Type":"ContainerDied","Data":"f4bb1af880eb4c6ffb76155c4aaa49d250ead38f7703e0d4b61904354d3c1d53"} Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.575382 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4bb1af880eb4c6ffb76155c4aaa49d250ead38f7703e0d4b61904354d3c1d53" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.575497 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mlbcp" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.782669 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.783134 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" containerID="cri-o://59dfcbeeffabc99d039d88010bb5dbb9ae513cd15cc4d2873ae9b5dbbbe1aea4" gracePeriod=30 Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.783562 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" containerID="cri-o://e91f6fc2a51b6dd890ddd0cd906848298a6cfb611dc8571a601ec0e8de7c70bd" gracePeriod=30 Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.798032 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.798281 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" containerID="cri-o://4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" gracePeriod=30 Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.829507 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.829851 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-log" containerID="cri-o://990d530341f70397379fe224e343c016e6e159f6712f07a61694c646dab09e15" gracePeriod=30 Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.830011 5047 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-metadata" containerID="cri-o://fe25c9c5d07dc38acc41cd64e8eebf2bad34c1053763c03e932f4786a28248cb" gracePeriod=30 Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.840391 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.102:8775/\": EOF" Feb 23 08:52:52 crc kubenswrapper[5047]: I0223 08:52:52.845748 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.102:8775/\": EOF" Feb 23 08:52:53 crc kubenswrapper[5047]: I0223 08:52:53.588894 5047 generic.go:334] "Generic (PLEG): container finished" podID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerID="990d530341f70397379fe224e343c016e6e159f6712f07a61694c646dab09e15" exitCode=143 Feb 23 08:52:53 crc kubenswrapper[5047]: I0223 08:52:53.589600 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"099c24fd-6452-4bfb-99bb-08b1304412b3","Type":"ContainerDied","Data":"990d530341f70397379fe224e343c016e6e159f6712f07a61694c646dab09e15"} Feb 23 08:52:53 crc kubenswrapper[5047]: I0223 08:52:53.592283 5047 generic.go:334] "Generic (PLEG): container finished" podID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerID="59dfcbeeffabc99d039d88010bb5dbb9ae513cd15cc4d2873ae9b5dbbbe1aea4" exitCode=143 Feb 23 08:52:53 crc kubenswrapper[5047]: I0223 08:52:53.592363 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf","Type":"ContainerDied","Data":"59dfcbeeffabc99d039d88010bb5dbb9ae513cd15cc4d2873ae9b5dbbbe1aea4"} Feb 23 08:52:55 crc kubenswrapper[5047]: E0223 08:52:55.768525 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:52:55 crc kubenswrapper[5047]: E0223 08:52:55.771617 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:52:55 crc kubenswrapper[5047]: E0223 08:52:55.773473 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:52:55 crc kubenswrapper[5047]: E0223 08:52:55.773891 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:52:57 crc kubenswrapper[5047]: I0223 08:52:57.304012 5047 scope.go:117] "RemoveContainer" containerID="5f07377a165e12fbd1188e6620e4071673d8d697c7450db3dcd1239ec9b501a5" Feb 23 08:52:57 crc kubenswrapper[5047]: I0223 08:52:57.341110 5047 scope.go:117] "RemoveContainer" 
containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:52:57 crc kubenswrapper[5047]: E0223 08:52:57.341567 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:52:57 crc kubenswrapper[5047]: I0223 08:52:57.348632 5047 scope.go:117] "RemoveContainer" containerID="5537e05cd0355fd5f0df0c83cefaaf1ce918ee0b53b90094a0be5217922e10bf" Feb 23 08:52:57 crc kubenswrapper[5047]: I0223 08:52:57.375306 5047 scope.go:117] "RemoveContainer" containerID="f6f1b79ce137127b65dabf259d8de39db00053284a39ac300bf2af86e9705136" Feb 23 08:52:57 crc kubenswrapper[5047]: I0223 08:52:57.420138 5047 scope.go:117] "RemoveContainer" containerID="b8db065030ca1be9880114e8eba32cadb35bd96e02839eebdeace90dc4243c79" Feb 23 08:53:00 crc kubenswrapper[5047]: E0223 08:53:00.768709 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:00 crc kubenswrapper[5047]: E0223 08:53:00.771042 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:00 crc kubenswrapper[5047]: E0223 08:53:00.772713 5047 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:00 crc kubenswrapper[5047]: E0223 08:53:00.772843 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:53:05 crc kubenswrapper[5047]: E0223 08:53:05.768834 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:05 crc kubenswrapper[5047]: E0223 08:53:05.772122 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:05 crc kubenswrapper[5047]: E0223 08:53:05.774308 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:05 crc kubenswrapper[5047]: E0223 08:53:05.774419 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.753923 5047 generic.go:334] "Generic (PLEG): container finished" podID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerID="fe25c9c5d07dc38acc41cd64e8eebf2bad34c1053763c03e932f4786a28248cb" exitCode=0 Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.754042 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"099c24fd-6452-4bfb-99bb-08b1304412b3","Type":"ContainerDied","Data":"fe25c9c5d07dc38acc41cd64e8eebf2bad34c1053763c03e932f4786a28248cb"} Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.754278 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"099c24fd-6452-4bfb-99bb-08b1304412b3","Type":"ContainerDied","Data":"5d75e09e6b11c9acd3c70741bf8d77988f67438fed8cc0ebfaf49ba170d5ad78"} Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.754297 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d75e09e6b11c9acd3c70741bf8d77988f67438fed8cc0ebfaf49ba170d5ad78" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.755878 5047 generic.go:334] "Generic (PLEG): container finished" podID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerID="e91f6fc2a51b6dd890ddd0cd906848298a6cfb611dc8571a601ec0e8de7c70bd" exitCode=0 Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.755922 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf","Type":"ContainerDied","Data":"e91f6fc2a51b6dd890ddd0cd906848298a6cfb611dc8571a601ec0e8de7c70bd"} Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.755986 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf","Type":"ContainerDied","Data":"66228b74688ac16182c4b891cb9b1ea87902277de99a4c5391fd3c32d61b30f0"} Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.756012 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66228b74688ac16182c4b891cb9b1ea87902277de99a4c5391fd3c32d61b30f0" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.807046 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.824439 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.875708 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-config-data\") pod \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.875821 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-combined-ca-bundle\") pod \"099c24fd-6452-4bfb-99bb-08b1304412b3\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.875880 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099c24fd-6452-4bfb-99bb-08b1304412b3-logs\") pod \"099c24fd-6452-4bfb-99bb-08b1304412b3\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.876655 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/099c24fd-6452-4bfb-99bb-08b1304412b3-logs" (OuterVolumeSpecName: "logs") pod 
"099c24fd-6452-4bfb-99bb-08b1304412b3" (UID: "099c24fd-6452-4bfb-99bb-08b1304412b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.876807 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-logs\") pod \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.877328 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-logs" (OuterVolumeSpecName: "logs") pod "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" (UID: "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.877407 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-nova-metadata-tls-certs\") pod \"099c24fd-6452-4bfb-99bb-08b1304412b3\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.877881 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-combined-ca-bundle\") pod \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\" (UID: \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.877977 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qn9s\" (UniqueName: \"kubernetes.io/projected/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-kube-api-access-6qn9s\") pod \"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\" (UID: 
\"37115b42-ccfb-4aad-8df3-9e15ae5d2dcf\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.878018 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-config-data\") pod \"099c24fd-6452-4bfb-99bb-08b1304412b3\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.878049 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pvnd\" (UniqueName: \"kubernetes.io/projected/099c24fd-6452-4bfb-99bb-08b1304412b3-kube-api-access-6pvnd\") pod \"099c24fd-6452-4bfb-99bb-08b1304412b3\" (UID: \"099c24fd-6452-4bfb-99bb-08b1304412b3\") " Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.878703 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099c24fd-6452-4bfb-99bb-08b1304412b3-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.878720 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.884292 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099c24fd-6452-4bfb-99bb-08b1304412b3-kube-api-access-6pvnd" (OuterVolumeSpecName: "kube-api-access-6pvnd") pod "099c24fd-6452-4bfb-99bb-08b1304412b3" (UID: "099c24fd-6452-4bfb-99bb-08b1304412b3"). InnerVolumeSpecName "kube-api-access-6pvnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.885121 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-kube-api-access-6qn9s" (OuterVolumeSpecName: "kube-api-access-6qn9s") pod "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" (UID: "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf"). InnerVolumeSpecName "kube-api-access-6qn9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.910126 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-config-data" (OuterVolumeSpecName: "config-data") pod "099c24fd-6452-4bfb-99bb-08b1304412b3" (UID: "099c24fd-6452-4bfb-99bb-08b1304412b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.913458 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-config-data" (OuterVolumeSpecName: "config-data") pod "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" (UID: "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.923235 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" (UID: "37115b42-ccfb-4aad-8df3-9e15ae5d2dcf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.936874 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "099c24fd-6452-4bfb-99bb-08b1304412b3" (UID: "099c24fd-6452-4bfb-99bb-08b1304412b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.942799 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "099c24fd-6452-4bfb-99bb-08b1304412b3" (UID: "099c24fd-6452-4bfb-99bb-08b1304412b3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.980883 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.980926 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.980938 5047 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.980947 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.980956 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qn9s\" (UniqueName: \"kubernetes.io/projected/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf-kube-api-access-6qn9s\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.980965 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/099c24fd-6452-4bfb-99bb-08b1304412b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:06 crc kubenswrapper[5047]: I0223 08:53:06.980972 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pvnd\" (UniqueName: \"kubernetes.io/projected/099c24fd-6452-4bfb-99bb-08b1304412b3-kube-api-access-6pvnd\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.765458 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.765543 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.813876 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.825761 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.837359 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.847078 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.857831 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:07 crc kubenswrapper[5047]: E0223 08:53:07.858556 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-log" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.858592 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-log" Feb 23 08:53:07 crc kubenswrapper[5047]: E0223 08:53:07.858665 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-metadata" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.858677 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-metadata" Feb 23 08:53:07 crc kubenswrapper[5047]: E0223 08:53:07.858690 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4323bc60-603e-45a2-81fb-7a8fa07d07c7" containerName="nova-manage" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.858705 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4323bc60-603e-45a2-81fb-7a8fa07d07c7" containerName="nova-manage" Feb 
23 08:53:07 crc kubenswrapper[5047]: E0223 08:53:07.858724 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.858735 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" Feb 23 08:53:07 crc kubenswrapper[5047]: E0223 08:53:07.858758 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.858769 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.859071 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-log" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.859118 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-log" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.859157 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" containerName="nova-metadata-metadata" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.859174 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4323bc60-603e-45a2-81fb-7a8fa07d07c7" containerName="nova-manage" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.859200 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" containerName="nova-api-api" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.860944 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.869118 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.869459 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.905873 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.906450 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-config-data\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.906535 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.906598 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hnw\" (UniqueName: \"kubernetes.io/projected/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-kube-api-access-m2hnw\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.906650 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-logs\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 
08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.907522 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.911782 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.912442 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 08:53:07 crc kubenswrapper[5047]: I0223 08:53:07.928789 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009259 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-config-data\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009337 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009366 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-config-data\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009389 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfq4j\" (UniqueName: 
\"kubernetes.io/projected/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-kube-api-access-wfq4j\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009422 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009463 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009751 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-logs\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.009878 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hnw\" (UniqueName: \"kubernetes.io/projected/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-kube-api-access-m2hnw\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.010143 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-logs\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " 
pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.010759 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-logs\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.019932 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-config-data\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.020256 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.037840 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hnw\" (UniqueName: \"kubernetes.io/projected/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-kube-api-access-m2hnw\") pod \"nova-api-0\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.113039 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.113106 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-config-data\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.113157 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfq4j\" (UniqueName: \"kubernetes.io/projected/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-kube-api-access-wfq4j\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.113216 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.113298 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-logs\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.113925 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-logs\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.116842 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc 
kubenswrapper[5047]: I0223 08:53:08.116939 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-config-data\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.117805 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.130337 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfq4j\" (UniqueName: \"kubernetes.io/projected/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-kube-api-access-wfq4j\") pod \"nova-metadata-0\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.186946 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.224223 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.364479 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099c24fd-6452-4bfb-99bb-08b1304412b3" path="/var/lib/kubelet/pods/099c24fd-6452-4bfb-99bb-08b1304412b3/volumes" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.366515 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37115b42-ccfb-4aad-8df3-9e15ae5d2dcf" path="/var/lib/kubelet/pods/37115b42-ccfb-4aad-8df3-9e15ae5d2dcf/volumes" Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.749781 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.820516 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba1ab19e-e607-4e67-8f7a-aac99104a8c1","Type":"ContainerStarted","Data":"f9bc0f9d943a6678a3dbd389ab5cf65d65bf8e1058d9b0df0bf3edaa116b86ed"} Feb 23 08:53:08 crc kubenswrapper[5047]: I0223 08:53:08.822781 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:53:09 crc kubenswrapper[5047]: I0223 08:53:09.833930 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba1ab19e-e607-4e67-8f7a-aac99104a8c1","Type":"ContainerStarted","Data":"f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee"} Feb 23 08:53:09 crc kubenswrapper[5047]: I0223 08:53:09.834178 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba1ab19e-e607-4e67-8f7a-aac99104a8c1","Type":"ContainerStarted","Data":"618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846"} Feb 23 08:53:09 crc kubenswrapper[5047]: I0223 08:53:09.837806 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"4051daa4-7ea3-4ab2-ad1f-52353e0ad995","Type":"ContainerStarted","Data":"d75d9c5c9f819a4fef6f8458366281a7a7b56194dad2149a7dac9c6288a64b3e"} Feb 23 08:53:09 crc kubenswrapper[5047]: I0223 08:53:09.837867 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4051daa4-7ea3-4ab2-ad1f-52353e0ad995","Type":"ContainerStarted","Data":"c349d4019c79a38336f264dd317d2df1febd4f029273706fb5fc837b0d162f2a"} Feb 23 08:53:09 crc kubenswrapper[5047]: I0223 08:53:09.837883 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4051daa4-7ea3-4ab2-ad1f-52353e0ad995","Type":"ContainerStarted","Data":"0e668d66a5382cbefe7b815098f939d1374da758802c5066e2cf912cdf260755"} Feb 23 08:53:09 crc kubenswrapper[5047]: I0223 08:53:09.856101 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.856070564 podStartE2EDuration="2.856070564s" podCreationTimestamp="2026-02-23 08:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:53:09.853170477 +0000 UTC m=+7712.104497611" watchObservedRunningTime="2026-02-23 08:53:09.856070564 +0000 UTC m=+7712.107397698" Feb 23 08:53:09 crc kubenswrapper[5047]: I0223 08:53:09.877137 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.877103489 podStartE2EDuration="2.877103489s" podCreationTimestamp="2026-02-23 08:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:53:09.873193594 +0000 UTC m=+7712.124520748" watchObservedRunningTime="2026-02-23 08:53:09.877103489 +0000 UTC m=+7712.128430623" Feb 23 08:53:10 crc kubenswrapper[5047]: E0223 08:53:10.767449 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:10 crc kubenswrapper[5047]: E0223 08:53:10.769636 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:10 crc kubenswrapper[5047]: E0223 08:53:10.771581 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:10 crc kubenswrapper[5047]: E0223 08:53:10.771659 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:53:11 crc kubenswrapper[5047]: I0223 08:53:11.342051 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:53:11 crc kubenswrapper[5047]: E0223 08:53:11.342349 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:53:13 crc kubenswrapper[5047]: I0223 08:53:13.224377 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:53:13 crc kubenswrapper[5047]: I0223 08:53:13.224746 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:53:15 crc kubenswrapper[5047]: E0223 08:53:15.769007 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:15 crc kubenswrapper[5047]: E0223 08:53:15.771201 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:15 crc kubenswrapper[5047]: E0223 08:53:15.773452 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:15 crc kubenswrapper[5047]: E0223 08:53:15.773551 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" 
Feb 23 08:53:18 crc kubenswrapper[5047]: I0223 08:53:18.188246 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:53:18 crc kubenswrapper[5047]: I0223 08:53:18.188717 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:53:18 crc kubenswrapper[5047]: I0223 08:53:18.225327 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:53:18 crc kubenswrapper[5047]: I0223 08:53:18.225393 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:53:19 crc kubenswrapper[5047]: I0223 08:53:19.271135 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.105:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:53:19 crc kubenswrapper[5047]: I0223 08:53:19.287327 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.105:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:53:19 crc kubenswrapper[5047]: I0223 08:53:19.287417 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.106:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 08:53:19 crc kubenswrapper[5047]: I0223 08:53:19.287364 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.106:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 08:53:20 crc kubenswrapper[5047]: E0223 08:53:20.793687 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:20 crc kubenswrapper[5047]: E0223 08:53:20.835383 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:20 crc kubenswrapper[5047]: E0223 08:53:20.841950 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:53:20 crc kubenswrapper[5047]: E0223 08:53:20.842040 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:53:22 crc kubenswrapper[5047]: I0223 08:53:22.971025 5047 generic.go:334] "Generic (PLEG): container finished" podID="46dcfef0-cc05-4de1-977f-622b93900d64" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" exitCode=137 Feb 23 
08:53:22 crc kubenswrapper[5047]: I0223 08:53:22.971214 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46dcfef0-cc05-4de1-977f-622b93900d64","Type":"ContainerDied","Data":"4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d"} Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.298714 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.377513 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw2tg\" (UniqueName: \"kubernetes.io/projected/46dcfef0-cc05-4de1-977f-622b93900d64-kube-api-access-mw2tg\") pod \"46dcfef0-cc05-4de1-977f-622b93900d64\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.377976 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-combined-ca-bundle\") pod \"46dcfef0-cc05-4de1-977f-622b93900d64\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.378275 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-config-data\") pod \"46dcfef0-cc05-4de1-977f-622b93900d64\" (UID: \"46dcfef0-cc05-4de1-977f-622b93900d64\") " Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.389198 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46dcfef0-cc05-4de1-977f-622b93900d64-kube-api-access-mw2tg" (OuterVolumeSpecName: "kube-api-access-mw2tg") pod "46dcfef0-cc05-4de1-977f-622b93900d64" (UID: "46dcfef0-cc05-4de1-977f-622b93900d64"). InnerVolumeSpecName "kube-api-access-mw2tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.417104 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46dcfef0-cc05-4de1-977f-622b93900d64" (UID: "46dcfef0-cc05-4de1-977f-622b93900d64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.417247 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-config-data" (OuterVolumeSpecName: "config-data") pod "46dcfef0-cc05-4de1-977f-622b93900d64" (UID: "46dcfef0-cc05-4de1-977f-622b93900d64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.481298 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw2tg\" (UniqueName: \"kubernetes.io/projected/46dcfef0-cc05-4de1-977f-622b93900d64-kube-api-access-mw2tg\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.481344 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.481360 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dcfef0-cc05-4de1-977f-622b93900d64-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.984436 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"46dcfef0-cc05-4de1-977f-622b93900d64","Type":"ContainerDied","Data":"0af00cb3447a308c778d79461b822a0683c4ab285e68c6cb08115b2e0faf314e"} Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.984501 5047 scope.go:117] "RemoveContainer" containerID="4eca0ba7f02f08864a7ca6a1b3ae58d1234a4a730c0afbe0fb5960e27b7fbe4d" Feb 23 08:53:23 crc kubenswrapper[5047]: I0223 08:53:23.984505 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.048652 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.065440 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.077985 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:53:24 crc kubenswrapper[5047]: E0223 08:53:24.078503 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.078524 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.078843 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" containerName="nova-scheduler-scheduler" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.079738 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.084390 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.089617 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.197142 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5czz\" (UniqueName: \"kubernetes.io/projected/3464c846-13b9-479e-b9af-3d571f03b284-kube-api-access-r5czz\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.197612 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-config-data\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.197747 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.300884 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.301168 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5czz\" (UniqueName: \"kubernetes.io/projected/3464c846-13b9-479e-b9af-3d571f03b284-kube-api-access-r5czz\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.301226 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-config-data\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.311062 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-config-data\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.312838 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.323456 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5czz\" (UniqueName: \"kubernetes.io/projected/3464c846-13b9-479e-b9af-3d571f03b284-kube-api-access-r5czz\") pod \"nova-scheduler-0\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.341238 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:53:24 crc kubenswrapper[5047]: E0223 08:53:24.341778 5047 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.359092 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46dcfef0-cc05-4de1-977f-622b93900d64" path="/var/lib/kubelet/pods/46dcfef0-cc05-4de1-977f-622b93900d64/volumes" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.447891 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.903260 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:53:24 crc kubenswrapper[5047]: I0223 08:53:24.995560 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3464c846-13b9-479e-b9af-3d571f03b284","Type":"ContainerStarted","Data":"94047716d66aceb29fd35f503786f8bac1aca08ab50d7f10030a5a48d9126a7f"} Feb 23 08:53:26 crc kubenswrapper[5047]: I0223 08:53:26.005969 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3464c846-13b9-479e-b9af-3d571f03b284","Type":"ContainerStarted","Data":"86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3"} Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.191588 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.194177 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.194442 5047 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.201522 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.220437 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.220407883 podStartE2EDuration="4.220407883s" podCreationTimestamp="2026-02-23 08:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:53:26.026622654 +0000 UTC m=+7728.277949828" watchObservedRunningTime="2026-02-23 08:53:28.220407883 +0000 UTC m=+7730.471735027" Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.231364 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.245753 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 08:53:28 crc kubenswrapper[5047]: I0223 08:53:28.246872 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.042585 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.204553 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.260672 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.448934 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 08:53:29 
crc kubenswrapper[5047]: I0223 08:53:29.470361 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-749799ffdc-q8kjx"] Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.471992 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.509265 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-749799ffdc-q8kjx"] Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.627366 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-config\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.627520 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-nb\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.627569 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxts9\" (UniqueName: \"kubernetes.io/projected/51062b9d-51b2-4e47-b577-3cdc144cf0d1-kube-api-access-sxts9\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.627593 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-sb\") pod 
\"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.627924 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-dns-svc\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.730559 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-nb\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.730689 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxts9\" (UniqueName: \"kubernetes.io/projected/51062b9d-51b2-4e47-b577-3cdc144cf0d1-kube-api-access-sxts9\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.730726 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-sb\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.730816 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-dns-svc\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: 
\"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.730885 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-config\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.731771 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-nb\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.732133 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-config\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.732232 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-dns-svc\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.732333 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-sb\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: 
I0223 08:53:29.753080 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxts9\" (UniqueName: \"kubernetes.io/projected/51062b9d-51b2-4e47-b577-3cdc144cf0d1-kube-api-access-sxts9\") pod \"dnsmasq-dns-749799ffdc-q8kjx\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:29 crc kubenswrapper[5047]: I0223 08:53:29.800467 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:30 crc kubenswrapper[5047]: I0223 08:53:30.338320 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-749799ffdc-q8kjx"] Feb 23 08:53:31 crc kubenswrapper[5047]: I0223 08:53:31.065973 5047 generic.go:334] "Generic (PLEG): container finished" podID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerID="f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e" exitCode=0 Feb 23 08:53:31 crc kubenswrapper[5047]: I0223 08:53:31.066047 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" event={"ID":"51062b9d-51b2-4e47-b577-3cdc144cf0d1","Type":"ContainerDied","Data":"f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e"} Feb 23 08:53:31 crc kubenswrapper[5047]: I0223 08:53:31.066783 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" event={"ID":"51062b9d-51b2-4e47-b577-3cdc144cf0d1","Type":"ContainerStarted","Data":"cd82de8cc6bdaf684ad109dbf7950ce05eda902f0be6f3625e0bed3dacdb29d5"} Feb 23 08:53:32 crc kubenswrapper[5047]: I0223 08:53:32.081157 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" event={"ID":"51062b9d-51b2-4e47-b577-3cdc144cf0d1","Type":"ContainerStarted","Data":"071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c"} Feb 23 08:53:32 crc kubenswrapper[5047]: I0223 08:53:32.081544 5047 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 08:53:32 crc kubenswrapper[5047]: I0223 08:53:32.107831 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" podStartSLOduration=3.107792595 podStartE2EDuration="3.107792595s" podCreationTimestamp="2026-02-23 08:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:53:32.104898837 +0000 UTC m=+7734.356225971" watchObservedRunningTime="2026-02-23 08:53:32.107792595 +0000 UTC m=+7734.359119779" Feb 23 08:53:33 crc kubenswrapper[5047]: I0223 08:53:33.324890 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:33 crc kubenswrapper[5047]: I0223 08:53:33.325636 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-api" containerID="cri-o://f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee" gracePeriod=30 Feb 23 08:53:33 crc kubenswrapper[5047]: I0223 08:53:33.325461 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-log" containerID="cri-o://618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846" gracePeriod=30 Feb 23 08:53:34 crc kubenswrapper[5047]: I0223 08:53:34.102802 5047 generic.go:334] "Generic (PLEG): container finished" podID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerID="618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846" exitCode=143 Feb 23 08:53:34 crc kubenswrapper[5047]: I0223 08:53:34.102886 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ba1ab19e-e607-4e67-8f7a-aac99104a8c1","Type":"ContainerDied","Data":"618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846"} Feb 23 08:53:34 crc kubenswrapper[5047]: I0223 08:53:34.448607 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 08:53:34 crc kubenswrapper[5047]: I0223 08:53:34.479692 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 08:53:35 crc kubenswrapper[5047]: I0223 08:53:35.147519 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.342824 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:53:36 crc kubenswrapper[5047]: E0223 08:53:36.343324 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.931023 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.990419 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-logs\") pod \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.990486 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-config-data\") pod \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.990632 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2hnw\" (UniqueName: \"kubernetes.io/projected/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-kube-api-access-m2hnw\") pod \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.990681 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-combined-ca-bundle\") pod \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\" (UID: \"ba1ab19e-e607-4e67-8f7a-aac99104a8c1\") " Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.991082 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-logs" (OuterVolumeSpecName: "logs") pod "ba1ab19e-e607-4e67-8f7a-aac99104a8c1" (UID: "ba1ab19e-e607-4e67-8f7a-aac99104a8c1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.991630 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:36 crc kubenswrapper[5047]: I0223 08:53:36.997248 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-kube-api-access-m2hnw" (OuterVolumeSpecName: "kube-api-access-m2hnw") pod "ba1ab19e-e607-4e67-8f7a-aac99104a8c1" (UID: "ba1ab19e-e607-4e67-8f7a-aac99104a8c1"). InnerVolumeSpecName "kube-api-access-m2hnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.024858 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-config-data" (OuterVolumeSpecName: "config-data") pod "ba1ab19e-e607-4e67-8f7a-aac99104a8c1" (UID: "ba1ab19e-e607-4e67-8f7a-aac99104a8c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.034350 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba1ab19e-e607-4e67-8f7a-aac99104a8c1" (UID: "ba1ab19e-e607-4e67-8f7a-aac99104a8c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.093735 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.093763 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2hnw\" (UniqueName: \"kubernetes.io/projected/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-kube-api-access-m2hnw\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.093793 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1ab19e-e607-4e67-8f7a-aac99104a8c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.133182 5047 generic.go:334] "Generic (PLEG): container finished" podID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerID="f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee" exitCode=0 Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.133243 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba1ab19e-e607-4e67-8f7a-aac99104a8c1","Type":"ContainerDied","Data":"f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee"} Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.133255 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.133277 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba1ab19e-e607-4e67-8f7a-aac99104a8c1","Type":"ContainerDied","Data":"f9bc0f9d943a6678a3dbd389ab5cf65d65bf8e1058d9b0df0bf3edaa116b86ed"} Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.133311 5047 scope.go:117] "RemoveContainer" containerID="f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.173945 5047 scope.go:117] "RemoveContainer" containerID="618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.213251 5047 scope.go:117] "RemoveContainer" containerID="f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee" Feb 23 08:53:37 crc kubenswrapper[5047]: E0223 08:53:37.214384 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee\": container with ID starting with f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee not found: ID does not exist" containerID="f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.214421 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee"} err="failed to get container status \"f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee\": rpc error: code = NotFound desc = could not find container \"f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee\": container with ID starting with f3ff93d103df99f49d592c1bb86586e134f25b80fc2aee65938f62ed836549ee not found: ID does not exist" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 
08:53:37.214452 5047 scope.go:117] "RemoveContainer" containerID="618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846" Feb 23 08:53:37 crc kubenswrapper[5047]: E0223 08:53:37.220324 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846\": container with ID starting with 618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846 not found: ID does not exist" containerID="618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.220378 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846"} err="failed to get container status \"618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846\": rpc error: code = NotFound desc = could not find container \"618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846\": container with ID starting with 618016af1ac66d1f2dbd02aae9dd62989f19fb1d90a8b42efb2203a75c178846 not found: ID does not exist" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.229273 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.246573 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.256057 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:37 crc kubenswrapper[5047]: E0223 08:53:37.256704 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-api" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.256724 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" 
containerName="nova-api-api" Feb 23 08:53:37 crc kubenswrapper[5047]: E0223 08:53:37.256769 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-log" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.256779 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-log" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.257028 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-log" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.257046 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" containerName="nova-api-api" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.259059 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.262167 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.262237 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.262440 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.268148 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.408949 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570bec3d-603c-4f92-b183-c5abb7e799d8-logs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 
crc kubenswrapper[5047]: I0223 08:53:37.409009 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-public-tls-certs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.409051 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-config-data\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.409130 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq6kc\" (UniqueName: \"kubernetes.io/projected/570bec3d-603c-4f92-b183-c5abb7e799d8-kube-api-access-dq6kc\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.409217 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.409258 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.510518 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.510593 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.515331 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570bec3d-603c-4f92-b183-c5abb7e799d8-logs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.515388 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-public-tls-certs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.515439 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-config-data\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.515499 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6kc\" (UniqueName: \"kubernetes.io/projected/570bec3d-603c-4f92-b183-c5abb7e799d8-kube-api-access-dq6kc\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc 
kubenswrapper[5047]: I0223 08:53:37.517518 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570bec3d-603c-4f92-b183-c5abb7e799d8-logs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.518430 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.518452 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-public-tls-certs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.518901 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.520971 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-config-data\") pod \"nova-api-0\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0" Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.536796 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq6kc\" (UniqueName: \"kubernetes.io/projected/570bec3d-603c-4f92-b183-c5abb7e799d8-kube-api-access-dq6kc\") pod \"nova-api-0\" (UID: 
\"570bec3d-603c-4f92-b183-c5abb7e799d8\") " pod="openstack/nova-api-0"
Feb 23 08:53:37 crc kubenswrapper[5047]: I0223 08:53:37.583017 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:53:38 crc kubenswrapper[5047]: I0223 08:53:38.128239 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:53:38 crc kubenswrapper[5047]: W0223 08:53:38.135263 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod570bec3d_603c_4f92_b183_c5abb7e799d8.slice/crio-ff62d255a189ffb7e6fe77844e53d3e02120440be58166aca5aaf2e16703b99a WatchSource:0}: Error finding container ff62d255a189ffb7e6fe77844e53d3e02120440be58166aca5aaf2e16703b99a: Status 404 returned error can't find the container with id ff62d255a189ffb7e6fe77844e53d3e02120440be58166aca5aaf2e16703b99a
Feb 23 08:53:38 crc kubenswrapper[5047]: I0223 08:53:38.352995 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba1ab19e-e607-4e67-8f7a-aac99104a8c1" path="/var/lib/kubelet/pods/ba1ab19e-e607-4e67-8f7a-aac99104a8c1/volumes"
Feb 23 08:53:39 crc kubenswrapper[5047]: I0223 08:53:39.166161 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"570bec3d-603c-4f92-b183-c5abb7e799d8","Type":"ContainerStarted","Data":"0aa17ecda203fdcd836c6d88a867e21da21bf1e6661eebac5135520ec1cd39ff"}
Feb 23 08:53:39 crc kubenswrapper[5047]: I0223 08:53:39.166560 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"570bec3d-603c-4f92-b183-c5abb7e799d8","Type":"ContainerStarted","Data":"a5b2ffecbcd348e06e8ff7bb5a39ad7d2084d7e9a705dfca2cffe091ad053ce8"}
Feb 23 08:53:39 crc kubenswrapper[5047]: I0223 08:53:39.166572 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"570bec3d-603c-4f92-b183-c5abb7e799d8","Type":"ContainerStarted","Data":"ff62d255a189ffb7e6fe77844e53d3e02120440be58166aca5aaf2e16703b99a"}
Feb 23 08:53:39 crc kubenswrapper[5047]: I0223 08:53:39.215430 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.21541008 podStartE2EDuration="2.21541008s" podCreationTimestamp="2026-02-23 08:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:53:39.211975747 +0000 UTC m=+7741.463302881" watchObservedRunningTime="2026-02-23 08:53:39.21541008 +0000 UTC m=+7741.466737214"
Feb 23 08:53:39 crc kubenswrapper[5047]: I0223 08:53:39.802236 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx"
Feb 23 08:53:39 crc kubenswrapper[5047]: I0223 08:53:39.887835 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dd599d8f5-jwm7f"]
Feb 23 08:53:39 crc kubenswrapper[5047]: I0223 08:53:39.888278 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" podUID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerName="dnsmasq-dns" containerID="cri-o://6159200a71b24cc7fa6af5ac76985c1fa1ccf0657187df605681b308fa0d502c" gracePeriod=10
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.174842 5047 generic.go:334] "Generic (PLEG): container finished" podID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerID="6159200a71b24cc7fa6af5ac76985c1fa1ccf0657187df605681b308fa0d502c" exitCode=0
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.175739 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" event={"ID":"29eb3c3a-fe49-482a-95f2-dcb88bd99631","Type":"ContainerDied","Data":"6159200a71b24cc7fa6af5ac76985c1fa1ccf0657187df605681b308fa0d502c"}
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.427849 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f"
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.607053 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-sb\") pod \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") "
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.607175 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9596w\" (UniqueName: \"kubernetes.io/projected/29eb3c3a-fe49-482a-95f2-dcb88bd99631-kube-api-access-9596w\") pod \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") "
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.607239 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-dns-svc\") pod \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") "
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.607336 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-nb\") pod \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") "
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.607495 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-config\") pod \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\" (UID: \"29eb3c3a-fe49-482a-95f2-dcb88bd99631\") "
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.612972 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29eb3c3a-fe49-482a-95f2-dcb88bd99631-kube-api-access-9596w" (OuterVolumeSpecName: "kube-api-access-9596w") pod "29eb3c3a-fe49-482a-95f2-dcb88bd99631" (UID: "29eb3c3a-fe49-482a-95f2-dcb88bd99631"). InnerVolumeSpecName "kube-api-access-9596w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.660388 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29eb3c3a-fe49-482a-95f2-dcb88bd99631" (UID: "29eb3c3a-fe49-482a-95f2-dcb88bd99631"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.668162 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29eb3c3a-fe49-482a-95f2-dcb88bd99631" (UID: "29eb3c3a-fe49-482a-95f2-dcb88bd99631"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.675534 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-config" (OuterVolumeSpecName: "config") pod "29eb3c3a-fe49-482a-95f2-dcb88bd99631" (UID: "29eb3c3a-fe49-482a-95f2-dcb88bd99631"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.676956 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29eb3c3a-fe49-482a-95f2-dcb88bd99631" (UID: "29eb3c3a-fe49-482a-95f2-dcb88bd99631"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.709907 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.709959 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.709973 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9596w\" (UniqueName: \"kubernetes.io/projected/29eb3c3a-fe49-482a-95f2-dcb88bd99631-kube-api-access-9596w\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.709982 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:40 crc kubenswrapper[5047]: I0223 08:53:40.709991 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29eb3c3a-fe49-482a-95f2-dcb88bd99631-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:41 crc kubenswrapper[5047]: I0223 08:53:41.185952 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f" event={"ID":"29eb3c3a-fe49-482a-95f2-dcb88bd99631","Type":"ContainerDied","Data":"06426d728dd750a5be1cc05b5105273cbbd70455cee2b0b364062e08a0b23506"}
Feb 23 08:53:41 crc kubenswrapper[5047]: I0223 08:53:41.186036 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dd599d8f5-jwm7f"
Feb 23 08:53:41 crc kubenswrapper[5047]: I0223 08:53:41.186341 5047 scope.go:117] "RemoveContainer" containerID="6159200a71b24cc7fa6af5ac76985c1fa1ccf0657187df605681b308fa0d502c"
Feb 23 08:53:41 crc kubenswrapper[5047]: I0223 08:53:41.225824 5047 scope.go:117] "RemoveContainer" containerID="94401db850ab8fb8a52690d1a1abec2f75c85554ac7083e24e6a3ea7a646477e"
Feb 23 08:53:41 crc kubenswrapper[5047]: I0223 08:53:41.236450 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dd599d8f5-jwm7f"]
Feb 23 08:53:41 crc kubenswrapper[5047]: I0223 08:53:41.248789 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dd599d8f5-jwm7f"]
Feb 23 08:53:42 crc kubenswrapper[5047]: I0223 08:53:42.354069 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" path="/var/lib/kubelet/pods/29eb3c3a-fe49-482a-95f2-dcb88bd99631/volumes"
Feb 23 08:53:47 crc kubenswrapper[5047]: I0223 08:53:47.583861 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 08:53:47 crc kubenswrapper[5047]: I0223 08:53:47.584618 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 08:53:48 crc kubenswrapper[5047]: I0223 08:53:48.597190 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.109:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 08:53:48 crc kubenswrapper[5047]: I0223 08:53:48.597282 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.109:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 08:53:51 crc kubenswrapper[5047]: I0223 08:53:51.341425 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"
Feb 23 08:53:51 crc kubenswrapper[5047]: E0223 08:53:51.342299 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:53:57 crc kubenswrapper[5047]: I0223 08:53:57.594870 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 08:53:57 crc kubenswrapper[5047]: I0223 08:53:57.595747 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 08:53:57 crc kubenswrapper[5047]: I0223 08:53:57.597266 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 08:53:57 crc kubenswrapper[5047]: I0223 08:53:57.630005 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 08:53:57 crc kubenswrapper[5047]: I0223 08:53:57.642845 5047 scope.go:117] "RemoveContainer" containerID="33eae7fd00d3573717637b6d1b27a4f357b68e5237d264e046608aad9d1f4ff8"
Feb 23 08:53:57 crc kubenswrapper[5047]: I0223 08:53:57.693880 5047 scope.go:117] "RemoveContainer" containerID="4b22d9a52d0ff7b3a17c2d02abb2ecb330d8044668dd0b051ad6fcd7999c316a"
Feb 23 08:53:57 crc kubenswrapper[5047]: I0223 08:53:57.721481 5047 scope.go:117] "RemoveContainer" containerID="d3834b3ad00e430f6121ada4b6bc0f2b772619f5c429be533bf0a83a15d57d54"
Feb 23 08:53:58 crc kubenswrapper[5047]: I0223 08:53:58.385214 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 08:53:58 crc kubenswrapper[5047]: I0223 08:53:58.410069 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 08:54:06 crc kubenswrapper[5047]: I0223 08:54:06.341557 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"
Feb 23 08:54:06 crc kubenswrapper[5047]: E0223 08:54:06.342775 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.064775 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7779cb966c-fqwpp"]
Feb 23 08:54:09 crc kubenswrapper[5047]: E0223 08:54:09.065642 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerName="init"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.065660 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerName="init"
Feb 23 08:54:09 crc kubenswrapper[5047]: E0223 08:54:09.065676 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerName="dnsmasq-dns"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.065684 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerName="dnsmasq-dns"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.065870 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="29eb3c3a-fe49-482a-95f2-dcb88bd99631" containerName="dnsmasq-dns"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.067092 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.070120 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.070580 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jrtt5"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.070779 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.071129 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.085432 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7779cb966c-fqwpp"]
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.110751 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-config-data\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.110803 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8w2l\" (UniqueName: \"kubernetes.io/projected/c6905234-3f71-4206-bd85-d07be4c3b1bb-kube-api-access-x8w2l\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.110979 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-scripts\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.111053 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6905234-3f71-4206-bd85-d07be4c3b1bb-logs\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.111169 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c6905234-3f71-4206-bd85-d07be4c3b1bb-horizon-secret-key\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.145181 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.145633 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-httpd" containerID="cri-o://2e1c400557585d9a0e53099a74875e3e37dfd67d5c33a46d6a666c6b8800a90d" gracePeriod=30
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.145458 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-log" containerID="cri-o://d3d9093d8deff26964fc7d5561bb294d0155048f79dbc17801862ad39d212e4e" gracePeriod=30
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.213207 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c6905234-3f71-4206-bd85-d07be4c3b1bb-horizon-secret-key\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.213596 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-config-data\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.213621 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8w2l\" (UniqueName: \"kubernetes.io/projected/c6905234-3f71-4206-bd85-d07be4c3b1bb-kube-api-access-x8w2l\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.213665 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-scripts\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.213703 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6905234-3f71-4206-bd85-d07be4c3b1bb-logs\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.214154 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6905234-3f71-4206-bd85-d07be4c3b1bb-logs\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.215558 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-config-data\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.217341 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f4589d677-twpgk"]
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.218998 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.220783 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-scripts\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.238343 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f4589d677-twpgk"]
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.239367 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c6905234-3f71-4206-bd85-d07be4c3b1bb-horizon-secret-key\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.243467 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8w2l\" (UniqueName: \"kubernetes.io/projected/c6905234-3f71-4206-bd85-d07be4c3b1bb-kube-api-access-x8w2l\") pod \"horizon-7779cb966c-fqwpp\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.252589 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.253045 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-log" containerID="cri-o://cbabf61df6138f8686107e1586bb8754fc246799ee80efcedfe1302d8c33e2fb" gracePeriod=30
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.253660 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-httpd" containerID="cri-o://a9c8d1972e5c3f1cc86887c0e2504a4bc819cd81afddbe8505b9ce83220d4f84" gracePeriod=30
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.320668 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-logs\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.320932 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-horizon-secret-key\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.321023 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-scripts\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.321270 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxj5\" (UniqueName: \"kubernetes.io/projected/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-kube-api-access-ndxj5\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.321377 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-config-data\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.395312 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7779cb966c-fqwpp"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.423268 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxj5\" (UniqueName: \"kubernetes.io/projected/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-kube-api-access-ndxj5\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.423341 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-config-data\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.423490 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-logs\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.423609 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-horizon-secret-key\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.423660 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-scripts\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.424731 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-logs\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.425511 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-scripts\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.425623 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-config-data\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.430397 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-horizon-secret-key\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.445127 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxj5\" (UniqueName: \"kubernetes.io/projected/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-kube-api-access-ndxj5\") pod \"horizon-f4589d677-twpgk\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.503246 5047 generic.go:334] "Generic (PLEG): container finished" podID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerID="cbabf61df6138f8686107e1586bb8754fc246799ee80efcedfe1302d8c33e2fb" exitCode=143
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.503452 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b7fa1af-25c8-42b0-a972-4666fa4e077f","Type":"ContainerDied","Data":"cbabf61df6138f8686107e1586bb8754fc246799ee80efcedfe1302d8c33e2fb"}
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.507356 5047 generic.go:334] "Generic (PLEG): container finished" podID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerID="d3d9093d8deff26964fc7d5561bb294d0155048f79dbc17801862ad39d212e4e" exitCode=143
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.507390 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25721b59-54fc-4bb4-99d3-baeec00b6794","Type":"ContainerDied","Data":"d3d9093d8deff26964fc7d5561bb294d0155048f79dbc17801862ad39d212e4e"}
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.643771 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f4589d677-twpgk"
Feb 23 08:54:09 crc kubenswrapper[5047]: I0223 08:54:09.995674 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7779cb966c-fqwpp"]
Feb 23 08:54:10 crc kubenswrapper[5047]: I0223 08:54:10.037769 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:54:10 crc kubenswrapper[5047]: I0223 08:54:10.111658 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f4589d677-twpgk"]
Feb 23 08:54:10 crc kubenswrapper[5047]: I0223 08:54:10.517185 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f4589d677-twpgk" event={"ID":"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df","Type":"ContainerStarted","Data":"6aa9e689358c9b19cd85a38da3c1f29e1f925a61e4d34b825c91ab97f0bad7fe"}
Feb 23 08:54:10 crc kubenswrapper[5047]: I0223 08:54:10.518348 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7779cb966c-fqwpp" event={"ID":"c6905234-3f71-4206-bd85-d07be4c3b1bb","Type":"ContainerStarted","Data":"4d952746a30848a563a21e31f0501d07d0fb20000cdd06f4cdc9b5643d161292"}
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.841993 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7779cb966c-fqwpp"]
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.864395 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d5d76bdc4-cpnp9"]
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.867699 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.873469 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.886801 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5d76bdc4-cpnp9"]
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.954727 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f4589d677-twpgk"]
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.977197 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7794c55cb8-dz2pc"]
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.979471 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7794c55cb8-dz2pc"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.993223 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7794c55cb8-dz2pc"]
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.999309 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddbc9d78-4505-4215-b28c-69af3df4de72-logs\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.999381 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-scripts\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.999430 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-combined-ca-bundle\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.999479 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-config-data\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.999508 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5jx4\" (UniqueName: \"kubernetes.io/projected/ddbc9d78-4505-4215-b28c-69af3df4de72-kube-api-access-m5jx4\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.999539 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-secret-key\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:11 crc kubenswrapper[5047]: I0223 08:54:11.999563 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-tls-certs\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.101666 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-secret-key\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102144 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-config-data\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102185 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-tls-certs\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102330 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-combined-ca-bundle\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102417 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90893ba0-abf6-4d64-be28-115c844f3252-logs\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102464 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-secret-key\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102487 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddbc9d78-4505-4215-b28c-69af3df4de72-logs\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102510 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx296\" (UniqueName: \"kubernetes.io/projected/90893ba0-abf6-4d64-be28-115c844f3252-kube-api-access-vx296\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102581 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-scripts\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102656 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-scripts\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc"
Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102685 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-combined-ca-bundle\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102724 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-config-data\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102759 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5jx4\" (UniqueName: \"kubernetes.io/projected/ddbc9d78-4505-4215-b28c-69af3df4de72-kube-api-access-m5jx4\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102784 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-tls-certs\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.102939 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddbc9d78-4505-4215-b28c-69af3df4de72-logs\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.103478 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-scripts\") pod \"horizon-5d5d76bdc4-cpnp9\" 
(UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.104218 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-config-data\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.110061 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-combined-ca-bundle\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.112458 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-tls-certs\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.114478 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-secret-key\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.120210 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5jx4\" (UniqueName: \"kubernetes.io/projected/ddbc9d78-4505-4215-b28c-69af3df4de72-kube-api-access-m5jx4\") pod \"horizon-5d5d76bdc4-cpnp9\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc 
kubenswrapper[5047]: I0223 08:54:12.197606 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.204357 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-combined-ca-bundle\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.204434 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90893ba0-abf6-4d64-be28-115c844f3252-logs\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.204482 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-secret-key\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.204519 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx296\" (UniqueName: \"kubernetes.io/projected/90893ba0-abf6-4d64-be28-115c844f3252-kube-api-access-vx296\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.204565 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-scripts\") pod \"horizon-7794c55cb8-dz2pc\" (UID: 
\"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.204652 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-tls-certs\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.204693 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-config-data\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.205699 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90893ba0-abf6-4d64-be28-115c844f3252-logs\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.205993 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-scripts\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.206822 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-config-data\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.209815 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-combined-ca-bundle\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.211035 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-tls-certs\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.218365 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-secret-key\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.235791 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx296\" (UniqueName: \"kubernetes.io/projected/90893ba0-abf6-4d64-be28-115c844f3252-kube-api-access-vx296\") pod \"horizon-7794c55cb8-dz2pc\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.315403 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.571102 5047 generic.go:334] "Generic (PLEG): container finished" podID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerID="2e1c400557585d9a0e53099a74875e3e37dfd67d5c33a46d6a666c6b8800a90d" exitCode=0 Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.571558 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25721b59-54fc-4bb4-99d3-baeec00b6794","Type":"ContainerDied","Data":"2e1c400557585d9a0e53099a74875e3e37dfd67d5c33a46d6a666c6b8800a90d"} Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.791111 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5d76bdc4-cpnp9"] Feb 23 08:54:12 crc kubenswrapper[5047]: I0223 08:54:12.857210 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7794c55cb8-dz2pc"] Feb 23 08:54:12 crc kubenswrapper[5047]: W0223 08:54:12.874024 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90893ba0_abf6_4d64_be28_115c844f3252.slice/crio-e457ce8eeba6435b150c9187e7ab69b876f22b5794652c51d5b371cec0fc0ba5 WatchSource:0}: Error finding container e457ce8eeba6435b150c9187e7ab69b876f22b5794652c51d5b371cec0fc0ba5: Status 404 returned error can't find the container with id e457ce8eeba6435b150c9187e7ab69b876f22b5794652c51d5b371cec0fc0ba5 Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.341548 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.449849 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-logs\") pod \"25721b59-54fc-4bb4-99d3-baeec00b6794\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.450024 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-combined-ca-bundle\") pod \"25721b59-54fc-4bb4-99d3-baeec00b6794\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.450082 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-public-tls-certs\") pod \"25721b59-54fc-4bb4-99d3-baeec00b6794\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.450112 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4ck8\" (UniqueName: \"kubernetes.io/projected/25721b59-54fc-4bb4-99d3-baeec00b6794-kube-api-access-z4ck8\") pod \"25721b59-54fc-4bb4-99d3-baeec00b6794\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.450207 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-httpd-run\") pod \"25721b59-54fc-4bb4-99d3-baeec00b6794\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.450369 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-config-data\") pod \"25721b59-54fc-4bb4-99d3-baeec00b6794\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.450400 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-scripts\") pod \"25721b59-54fc-4bb4-99d3-baeec00b6794\" (UID: \"25721b59-54fc-4bb4-99d3-baeec00b6794\") " Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.452710 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-logs" (OuterVolumeSpecName: "logs") pod "25721b59-54fc-4bb4-99d3-baeec00b6794" (UID: "25721b59-54fc-4bb4-99d3-baeec00b6794"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.457814 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "25721b59-54fc-4bb4-99d3-baeec00b6794" (UID: "25721b59-54fc-4bb4-99d3-baeec00b6794"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.461621 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25721b59-54fc-4bb4-99d3-baeec00b6794-kube-api-access-z4ck8" (OuterVolumeSpecName: "kube-api-access-z4ck8") pod "25721b59-54fc-4bb4-99d3-baeec00b6794" (UID: "25721b59-54fc-4bb4-99d3-baeec00b6794"). InnerVolumeSpecName "kube-api-access-z4ck8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.480245 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-scripts" (OuterVolumeSpecName: "scripts") pod "25721b59-54fc-4bb4-99d3-baeec00b6794" (UID: "25721b59-54fc-4bb4-99d3-baeec00b6794"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.499326 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25721b59-54fc-4bb4-99d3-baeec00b6794" (UID: "25721b59-54fc-4bb4-99d3-baeec00b6794"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.538949 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "25721b59-54fc-4bb4-99d3-baeec00b6794" (UID: "25721b59-54fc-4bb4-99d3-baeec00b6794"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.542188 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-config-data" (OuterVolumeSpecName: "config-data") pod "25721b59-54fc-4bb4-99d3-baeec00b6794" (UID: "25721b59-54fc-4bb4-99d3-baeec00b6794"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.554327 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.554374 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.554387 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.554401 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.554421 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25721b59-54fc-4bb4-99d3-baeec00b6794-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.554435 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4ck8\" (UniqueName: \"kubernetes.io/projected/25721b59-54fc-4bb4-99d3-baeec00b6794-kube-api-access-z4ck8\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.554450 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25721b59-54fc-4bb4-99d3-baeec00b6794-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.586112 5047 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25721b59-54fc-4bb4-99d3-baeec00b6794","Type":"ContainerDied","Data":"1d44fe834b1c58cbe166cac3125e8397d4937d44adfde62a5a54135f7b570bde"} Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.586181 5047 scope.go:117] "RemoveContainer" containerID="2e1c400557585d9a0e53099a74875e3e37dfd67d5c33a46d6a666c6b8800a90d" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.586347 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.590265 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7794c55cb8-dz2pc" event={"ID":"90893ba0-abf6-4d64-be28-115c844f3252","Type":"ContainerStarted","Data":"e457ce8eeba6435b150c9187e7ab69b876f22b5794652c51d5b371cec0fc0ba5"} Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.593187 5047 generic.go:334] "Generic (PLEG): container finished" podID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerID="a9c8d1972e5c3f1cc86887c0e2504a4bc819cd81afddbe8505b9ce83220d4f84" exitCode=0 Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.593287 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b7fa1af-25c8-42b0-a972-4666fa4e077f","Type":"ContainerDied","Data":"a9c8d1972e5c3f1cc86887c0e2504a4bc819cd81afddbe8505b9ce83220d4f84"} Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.599456 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d76bdc4-cpnp9" event={"ID":"ddbc9d78-4505-4215-b28c-69af3df4de72","Type":"ContainerStarted","Data":"d3989c0690715316910da09d2405d581a7394fdfe3fe7e45d50ed75c4b58e087"} Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.632513 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.663397 
5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.671268 5047 scope.go:117] "RemoveContainer" containerID="d3d9093d8deff26964fc7d5561bb294d0155048f79dbc17801862ad39d212e4e" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.698117 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:54:13 crc kubenswrapper[5047]: E0223 08:54:13.700569 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-httpd" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.700599 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-httpd" Feb 23 08:54:13 crc kubenswrapper[5047]: E0223 08:54:13.700651 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-log" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.700659 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-log" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.701162 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-httpd" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.701201 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" containerName="glance-log" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.703752 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.703882 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.707797 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.708200 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.781381 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-logs\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.781447 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-scripts\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.781467 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0" Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.781494 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " 
pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.781662 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-config-data\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.782067 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.782198 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t95r9\" (UniqueName: \"kubernetes.io/projected/95116420-b62b-402c-bcbe-17026cba0354-kube-api-access-t95r9\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.884229 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-scripts\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.884272 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.884297 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.884337 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-config-data\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.884394 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.884411 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t95r9\" (UniqueName: \"kubernetes.io/projected/95116420-b62b-402c-bcbe-17026cba0354-kube-api-access-t95r9\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.884485 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-logs\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.885064 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-logs\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.885286 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.888832 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-scripts\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.889026 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.892370 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.901615 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-config-data\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:13 crc kubenswrapper[5047]: I0223 08:54:13.924075 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t95r9\" (UniqueName: \"kubernetes.io/projected/95116420-b62b-402c-bcbe-17026cba0354-kube-api-access-t95r9\") pod \"glance-default-external-api-0\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.034108 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.354421 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25721b59-54fc-4bb4-99d3-baeec00b6794" path="/var/lib/kubelet/pods/25721b59-54fc-4bb4-99d3-baeec00b6794/volumes"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.427733 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.505874 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-scripts\") pod \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") "
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.506085 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-combined-ca-bundle\") pod \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") "
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.506136 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-logs\") pod \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") "
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.506198 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-config-data\") pod \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") "
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.506234 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-httpd-run\") pod \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") "
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.506282 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-internal-tls-certs\") pod \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") "
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.506868 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-logs" (OuterVolumeSpecName: "logs") pod "9b7fa1af-25c8-42b0-a972-4666fa4e077f" (UID: "9b7fa1af-25c8-42b0-a972-4666fa4e077f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.507635 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9b7fa1af-25c8-42b0-a972-4666fa4e077f" (UID: "9b7fa1af-25c8-42b0-a972-4666fa4e077f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.509215 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxfnb\" (UniqueName: \"kubernetes.io/projected/9b7fa1af-25c8-42b0-a972-4666fa4e077f-kube-api-access-cxfnb\") pod \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\" (UID: \"9b7fa1af-25c8-42b0-a972-4666fa4e077f\") "
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.511278 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-logs\") on node \"crc\" DevicePath \"\""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.511300 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b7fa1af-25c8-42b0-a972-4666fa4e077f-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.513323 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7fa1af-25c8-42b0-a972-4666fa4e077f-kube-api-access-cxfnb" (OuterVolumeSpecName: "kube-api-access-cxfnb") pod "9b7fa1af-25c8-42b0-a972-4666fa4e077f" (UID: "9b7fa1af-25c8-42b0-a972-4666fa4e077f"). InnerVolumeSpecName "kube-api-access-cxfnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.519493 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-scripts" (OuterVolumeSpecName: "scripts") pod "9b7fa1af-25c8-42b0-a972-4666fa4e077f" (UID: "9b7fa1af-25c8-42b0-a972-4666fa4e077f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.548210 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b7fa1af-25c8-42b0-a972-4666fa4e077f" (UID: "9b7fa1af-25c8-42b0-a972-4666fa4e077f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.580134 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-config-data" (OuterVolumeSpecName: "config-data") pod "9b7fa1af-25c8-42b0-a972-4666fa4e077f" (UID: "9b7fa1af-25c8-42b0-a972-4666fa4e077f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.597561 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9b7fa1af-25c8-42b0-a972-4666fa4e077f" (UID: "9b7fa1af-25c8-42b0-a972-4666fa4e077f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.614113 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.614163 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.614177 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.614194 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxfnb\" (UniqueName: \"kubernetes.io/projected/9b7fa1af-25c8-42b0-a972-4666fa4e077f-kube-api-access-cxfnb\") on node \"crc\" DevicePath \"\""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.614207 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7fa1af-25c8-42b0-a972-4666fa4e077f-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.626452 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b7fa1af-25c8-42b0-a972-4666fa4e077f","Type":"ContainerDied","Data":"c5468c8e21bd17c3568a2bcad01e6bedd812a73e027653ac80f8232fb6af1d22"}
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.626530 5047 scope.go:117] "RemoveContainer" containerID="a9c8d1972e5c3f1cc86887c0e2504a4bc819cd81afddbe8505b9ce83220d4f84"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.626694 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.669251 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.687410 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.701416 5047 scope.go:117] "RemoveContainer" containerID="cbabf61df6138f8686107e1586bb8754fc246799ee80efcedfe1302d8c33e2fb"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.703400 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:54:14 crc kubenswrapper[5047]: E0223 08:54:14.703814 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-log"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.703835 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-log"
Feb 23 08:54:14 crc kubenswrapper[5047]: E0223 08:54:14.703873 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-httpd"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.703881 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-httpd"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.704064 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-log"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.704085 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" containerName="glance-httpd"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.705132 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.717770 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.717779 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.718194 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.718239 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-logs\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.718283 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.718303 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.718869 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwvf\" (UniqueName: \"kubernetes.io/projected/4ce03d60-441e-4a79-acce-d54444aedfcb-kube-api-access-bxwvf\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.719048 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.719193 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.725814 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.748584 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.821422 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwvf\" (UniqueName: \"kubernetes.io/projected/4ce03d60-441e-4a79-acce-d54444aedfcb-kube-api-access-bxwvf\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.821481 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.821523 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.821580 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.821606 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-logs\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.821692 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.821710 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.822584 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-logs\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.822873 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.827373 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.832648 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.832770 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.833528 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:14 crc kubenswrapper[5047]: I0223 08:54:14.841285 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwvf\" (UniqueName: \"kubernetes.io/projected/4ce03d60-441e-4a79-acce-d54444aedfcb-kube-api-access-bxwvf\") pod \"glance-default-internal-api-0\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:15 crc kubenswrapper[5047]: I0223 08:54:15.043740 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:15 crc kubenswrapper[5047]: I0223 08:54:15.685631 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95116420-b62b-402c-bcbe-17026cba0354","Type":"ContainerStarted","Data":"e459a4cfaf687ea8deefc617116bb6434c5009226114df0c9539ae23a43598d7"}
Feb 23 08:54:15 crc kubenswrapper[5047]: I0223 08:54:15.722097 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:54:16 crc kubenswrapper[5047]: I0223 08:54:16.360959 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7fa1af-25c8-42b0-a972-4666fa4e077f" path="/var/lib/kubelet/pods/9b7fa1af-25c8-42b0-a972-4666fa4e077f/volumes"
Feb 23 08:54:16 crc kubenswrapper[5047]: I0223 08:54:16.700663 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ce03d60-441e-4a79-acce-d54444aedfcb","Type":"ContainerStarted","Data":"ec75f8350065896340138084424783a8ac231166ea7fffbf795299442b505721"}
Feb 23 08:54:16 crc kubenswrapper[5047]: I0223 08:54:16.700724 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ce03d60-441e-4a79-acce-d54444aedfcb","Type":"ContainerStarted","Data":"74a9ad11394e36f5cc68ae559052ccef981d45f632432c7f9fd20dee8c669234"}
Feb 23 08:54:16 crc kubenswrapper[5047]: I0223 08:54:16.705316 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95116420-b62b-402c-bcbe-17026cba0354","Type":"ContainerStarted","Data":"95c93cfd98d085f0620a320a6bc1bd384f6afe679ed96aea9f76c9b0475d04ab"}
Feb 23 08:54:16 crc kubenswrapper[5047]: I0223 08:54:16.705373 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95116420-b62b-402c-bcbe-17026cba0354","Type":"ContainerStarted","Data":"438245a707c3bbe792477e7783157c77ef4990e9609e3fae09f2db6284e24d03"}
Feb 23 08:54:17 crc kubenswrapper[5047]: I0223 08:54:17.722968 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ce03d60-441e-4a79-acce-d54444aedfcb","Type":"ContainerStarted","Data":"8c44a4d060fe0afbdab6a1f05ac20d1249c428efb3b01ca24ed8e720e9d10271"}
Feb 23 08:54:17 crc kubenswrapper[5047]: I0223 08:54:17.746120 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.746097329 podStartE2EDuration="4.746097329s" podCreationTimestamp="2026-02-23 08:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:54:16.737109154 +0000 UTC m=+7778.988436288" watchObservedRunningTime="2026-02-23 08:54:17.746097329 +0000 UTC m=+7779.997424463"
Feb 23 08:54:17 crc kubenswrapper[5047]: I0223 08:54:17.753976 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.753897428 podStartE2EDuration="3.753897428s" podCreationTimestamp="2026-02-23 08:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:54:17.74206579 +0000 UTC m=+7779.993392924" watchObservedRunningTime="2026-02-23 08:54:17.753897428 +0000 UTC m=+7780.005224562"
Feb 23 08:54:18 crc kubenswrapper[5047]: I0223 08:54:18.351419 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873"
Feb 23 08:54:18 crc kubenswrapper[5047]: E0223 08:54:18.351697 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.035274 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.036546 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.075195 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.091459 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.796693 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f4589d677-twpgk" event={"ID":"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df","Type":"ContainerStarted","Data":"0a1cf17da38d37e7e19cdcffb87509632fbe3fe36725bdb70121dd2da341db94"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.797023 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f4589d677-twpgk" event={"ID":"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df","Type":"ContainerStarted","Data":"3d09d52a458a6760ccf705d49c5dc5b61c3622bf94abcfefa00b55275e63a4aa"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.797188 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f4589d677-twpgk" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon-log" containerID="cri-o://3d09d52a458a6760ccf705d49c5dc5b61c3622bf94abcfefa00b55275e63a4aa" gracePeriod=30
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.797225 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f4589d677-twpgk" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon" containerID="cri-o://0a1cf17da38d37e7e19cdcffb87509632fbe3fe36725bdb70121dd2da341db94" gracePeriod=30
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.804336 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7794c55cb8-dz2pc" event={"ID":"90893ba0-abf6-4d64-be28-115c844f3252","Type":"ContainerStarted","Data":"a5cf62004fd25e3f3f20a2d2d9f3077f72bdd7c9b10511b881a6633d5f07383a"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.804405 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7794c55cb8-dz2pc" event={"ID":"90893ba0-abf6-4d64-be28-115c844f3252","Type":"ContainerStarted","Data":"70e7711cd78711a953a35e90a7e4aa5585094c69e0b06440ef4d0935f4c9720d"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.807377 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7779cb966c-fqwpp" event={"ID":"c6905234-3f71-4206-bd85-d07be4c3b1bb","Type":"ContainerStarted","Data":"b98f20d2ec60c3534f8277cf33c22e615f730b22832c9fbf00f49936a2f9b091"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.807432 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7779cb966c-fqwpp" event={"ID":"c6905234-3f71-4206-bd85-d07be4c3b1bb","Type":"ContainerStarted","Data":"0bdf234921f9ed5f40c39095e37eb4ae30b6cbbdc0bdd170ea8a987ef21440b8"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.807541 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7779cb966c-fqwpp" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon" containerID="cri-o://b98f20d2ec60c3534f8277cf33c22e615f730b22832c9fbf00f49936a2f9b091" gracePeriod=30
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.807534 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7779cb966c-fqwpp" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon-log" containerID="cri-o://0bdf234921f9ed5f40c39095e37eb4ae30b6cbbdc0bdd170ea8a987ef21440b8" gracePeriod=30
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.811159 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d76bdc4-cpnp9" event={"ID":"ddbc9d78-4505-4215-b28c-69af3df4de72","Type":"ContainerStarted","Data":"c17e0859db1cf49bd932cc5593a1692765c7b4cd7741539470b990a88f18badf"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.811599 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.811814 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.811838 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d76bdc4-cpnp9" event={"ID":"ddbc9d78-4505-4215-b28c-69af3df4de72","Type":"ContainerStarted","Data":"26017ece747974ffdb2e8ee6331e3bd6ff778aee7fd505cebbd28b7a52be3c0f"}
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.859341 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d5d76bdc4-cpnp9" podStartSLOduration=2.796007244 podStartE2EDuration="13.859313514s" podCreationTimestamp="2026-02-23 08:54:11 +0000 UTC" firstStartedPulling="2026-02-23 08:54:12.835094069 +0000 UTC m=+7775.086421203" lastFinishedPulling="2026-02-23 08:54:23.898400339 +0000 UTC m=+7786.149727473" observedRunningTime="2026-02-23 08:54:24.853568259 +0000 UTC m=+7787.104895413" watchObservedRunningTime="2026-02-23 08:54:24.859313514 +0000 UTC m=+7787.110640658"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.860807 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f4589d677-twpgk" podStartSLOduration=2.134140307 podStartE2EDuration="15.860797223s" podCreationTimestamp="2026-02-23 08:54:09 +0000 UTC" firstStartedPulling="2026-02-23 08:54:10.129155281 +0000 UTC m=+7772.380482425" lastFinishedPulling="2026-02-23 08:54:23.855812207 +0000 UTC m=+7786.107139341" observedRunningTime="2026-02-23 08:54:24.827123919 +0000 UTC m=+7787.078451053" watchObservedRunningTime="2026-02-23 08:54:24.860797223 +0000 UTC m=+7787.112124377"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.880206 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7794c55cb8-dz2pc" podStartSLOduration=2.912305114 podStartE2EDuration="13.880185893s" podCreationTimestamp="2026-02-23 08:54:11 +0000 UTC" firstStartedPulling="2026-02-23 08:54:12.889007896 +0000 UTC m=+7775.140335030" lastFinishedPulling="2026-02-23 08:54:23.856888675 +0000 UTC m=+7786.108215809" observedRunningTime="2026-02-23 08:54:24.876453554 +0000 UTC m=+7787.127780698" watchObservedRunningTime="2026-02-23 08:54:24.880185893 +0000 UTC m=+7787.131513027"
Feb 23 08:54:24 crc kubenswrapper[5047]: I0223 08:54:24.898644 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7779cb966c-fqwpp" podStartSLOduration=2.036860438 podStartE2EDuration="15.898624839s" podCreationTimestamp="2026-02-23 08:54:09 +0000 UTC" firstStartedPulling="2026-02-23 08:54:10.037545752 +0000 UTC m=+7772.288872886" lastFinishedPulling="2026-02-23 08:54:23.899310163 +0000 UTC m=+7786.150637287" observedRunningTime="2026-02-23 08:54:24.897080397 +0000 UTC m=+7787.148407551" watchObservedRunningTime="2026-02-23 08:54:24.898624839 +0000 UTC m=+7787.149951973"
Feb 23 08:54:25 crc kubenswrapper[5047]: I0223 08:54:25.044767 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:25 crc kubenswrapper[5047]: I0223 08:54:25.045243 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:25 crc kubenswrapper[5047]: I0223 08:54:25.079600 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:25 crc kubenswrapper[5047]: I0223 08:54:25.111872 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:25 crc kubenswrapper[5047]: I0223 08:54:25.819550 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:25 crc kubenswrapper[5047]: I0223 08:54:25.820077 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:26 crc kubenswrapper[5047]: I0223 08:54:26.782731 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:26 crc kubenswrapper[5047]: I0223 08:54:26.798613 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 08:54:27 crc kubenswrapper[5047]: I0223 08:54:27.836475 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 08:54:27 crc kubenswrapper[5047]: I0223 08:54:27.836521 5047 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 08:54:28 crc kubenswrapper[5047]: I0223 08:54:28.034868 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:28 crc kubenswrapper[5047]: I0223 08:54:28.361480 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 08:54:29 crc kubenswrapper[5047]: I0223 08:54:29.396186 5047
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7779cb966c-fqwpp" Feb 23 08:54:29 crc kubenswrapper[5047]: I0223 08:54:29.644330 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f4589d677-twpgk" Feb 23 08:54:30 crc kubenswrapper[5047]: I0223 08:54:30.342493 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:54:30 crc kubenswrapper[5047]: E0223 08:54:30.342757 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:54:32 crc kubenswrapper[5047]: I0223 08:54:32.198586 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:32 crc kubenswrapper[5047]: I0223 08:54:32.198951 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:32 crc kubenswrapper[5047]: I0223 08:54:32.316328 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:32 crc kubenswrapper[5047]: I0223 08:54:32.316555 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:42 crc kubenswrapper[5047]: I0223 08:54:42.200897 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d5d76bdc4-cpnp9" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.112:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 
10.217.1.112:8443: connect: connection refused" Feb 23 08:54:42 crc kubenswrapper[5047]: I0223 08:54:42.318259 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7794c55cb8-dz2pc" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.113:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8443: connect: connection refused" Feb 23 08:54:45 crc kubenswrapper[5047]: I0223 08:54:45.341541 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:54:45 crc kubenswrapper[5047]: E0223 08:54:45.342460 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 08:54:52 crc kubenswrapper[5047]: I0223 08:54:52.071934 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b60b-account-create-update-qclq5"] Feb 23 08:54:52 crc kubenswrapper[5047]: I0223 08:54:52.089218 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-k2z59"] Feb 23 08:54:52 crc kubenswrapper[5047]: I0223 08:54:52.106209 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-k2z59"] Feb 23 08:54:52 crc kubenswrapper[5047]: I0223 08:54:52.116230 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b60b-account-create-update-qclq5"] Feb 23 08:54:52 crc kubenswrapper[5047]: I0223 08:54:52.357577 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e78237-727f-4a10-bc58-53cbb650f5e2" 
path="/var/lib/kubelet/pods/18e78237-727f-4a10-bc58-53cbb650f5e2/volumes" Feb 23 08:54:52 crc kubenswrapper[5047]: I0223 08:54:52.358849 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3eac5f-445c-426a-8b40-da353ac3e4d7" path="/var/lib/kubelet/pods/2d3eac5f-445c-426a-8b40-da353ac3e4d7/volumes" Feb 23 08:54:54 crc kubenswrapper[5047]: I0223 08:54:54.058965 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:54 crc kubenswrapper[5047]: I0223 08:54:54.158220 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.179190 5047 generic.go:334] "Generic (PLEG): container finished" podID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerID="0a1cf17da38d37e7e19cdcffb87509632fbe3fe36725bdb70121dd2da341db94" exitCode=137 Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.179518 5047 generic.go:334] "Generic (PLEG): container finished" podID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerID="3d09d52a458a6760ccf705d49c5dc5b61c3622bf94abcfefa00b55275e63a4aa" exitCode=137 Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.179305 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f4589d677-twpgk" event={"ID":"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df","Type":"ContainerDied","Data":"0a1cf17da38d37e7e19cdcffb87509632fbe3fe36725bdb70121dd2da341db94"} Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.179594 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f4589d677-twpgk" event={"ID":"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df","Type":"ContainerDied","Data":"3d09d52a458a6760ccf705d49c5dc5b61c3622bf94abcfefa00b55275e63a4aa"} Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.181868 5047 generic.go:334] "Generic (PLEG): container finished" podID="c6905234-3f71-4206-bd85-d07be4c3b1bb" 
containerID="b98f20d2ec60c3534f8277cf33c22e615f730b22832c9fbf00f49936a2f9b091" exitCode=137 Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.181887 5047 generic.go:334] "Generic (PLEG): container finished" podID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerID="0bdf234921f9ed5f40c39095e37eb4ae30b6cbbdc0bdd170ea8a987ef21440b8" exitCode=137 Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.181929 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7779cb966c-fqwpp" event={"ID":"c6905234-3f71-4206-bd85-d07be4c3b1bb","Type":"ContainerDied","Data":"b98f20d2ec60c3534f8277cf33c22e615f730b22832c9fbf00f49936a2f9b091"} Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.181945 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7779cb966c-fqwpp" event={"ID":"c6905234-3f71-4206-bd85-d07be4c3b1bb","Type":"ContainerDied","Data":"0bdf234921f9ed5f40c39095e37eb4ae30b6cbbdc0bdd170ea8a987ef21440b8"} Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.331652 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f4589d677-twpgk" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.338123 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7779cb966c-fqwpp" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.438179 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-logs\") pod \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.438261 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-scripts\") pod \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.438290 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-horizon-secret-key\") pod \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.438348 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndxj5\" (UniqueName: \"kubernetes.io/projected/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-kube-api-access-ndxj5\") pod \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.438461 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-config-data\") pod \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\" (UID: \"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.439098 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-logs" (OuterVolumeSpecName: "logs") pod "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" (UID: "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.447110 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" (UID: "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.448050 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-kube-api-access-ndxj5" (OuterVolumeSpecName: "kube-api-access-ndxj5") pod "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" (UID: "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df"). InnerVolumeSpecName "kube-api-access-ndxj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.466478 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-config-data" (OuterVolumeSpecName: "config-data") pod "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" (UID: "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.469766 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-scripts" (OuterVolumeSpecName: "scripts") pod "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" (UID: "d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.539819 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-config-data\") pod \"c6905234-3f71-4206-bd85-d07be4c3b1bb\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.540423 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6905234-3f71-4206-bd85-d07be4c3b1bb-logs\") pod \"c6905234-3f71-4206-bd85-d07be4c3b1bb\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.540596 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c6905234-3f71-4206-bd85-d07be4c3b1bb-horizon-secret-key\") pod \"c6905234-3f71-4206-bd85-d07be4c3b1bb\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.540782 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8w2l\" (UniqueName: \"kubernetes.io/projected/c6905234-3f71-4206-bd85-d07be4c3b1bb-kube-api-access-x8w2l\") pod \"c6905234-3f71-4206-bd85-d07be4c3b1bb\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.541228 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-scripts\") pod \"c6905234-3f71-4206-bd85-d07be4c3b1bb\" (UID: \"c6905234-3f71-4206-bd85-d07be4c3b1bb\") " Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.542152 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c6905234-3f71-4206-bd85-d07be4c3b1bb-logs" (OuterVolumeSpecName: "logs") pod "c6905234-3f71-4206-bd85-d07be4c3b1bb" (UID: "c6905234-3f71-4206-bd85-d07be4c3b1bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.542323 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.542417 5047 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.542570 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndxj5\" (UniqueName: \"kubernetes.io/projected/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-kube-api-access-ndxj5\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.542605 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.542628 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.546220 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6905234-3f71-4206-bd85-d07be4c3b1bb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c6905234-3f71-4206-bd85-d07be4c3b1bb" (UID: "c6905234-3f71-4206-bd85-d07be4c3b1bb"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.546285 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6905234-3f71-4206-bd85-d07be4c3b1bb-kube-api-access-x8w2l" (OuterVolumeSpecName: "kube-api-access-x8w2l") pod "c6905234-3f71-4206-bd85-d07be4c3b1bb" (UID: "c6905234-3f71-4206-bd85-d07be4c3b1bb"). InnerVolumeSpecName "kube-api-access-x8w2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.570156 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-config-data" (OuterVolumeSpecName: "config-data") pod "c6905234-3f71-4206-bd85-d07be4c3b1bb" (UID: "c6905234-3f71-4206-bd85-d07be4c3b1bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.589544 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-scripts" (OuterVolumeSpecName: "scripts") pod "c6905234-3f71-4206-bd85-d07be4c3b1bb" (UID: "c6905234-3f71-4206-bd85-d07be4c3b1bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.645592 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6905234-3f71-4206-bd85-d07be4c3b1bb-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.645705 5047 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c6905234-3f71-4206-bd85-d07be4c3b1bb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.645732 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8w2l\" (UniqueName: \"kubernetes.io/projected/c6905234-3f71-4206-bd85-d07be4c3b1bb-kube-api-access-x8w2l\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.645747 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.645760 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6905234-3f71-4206-bd85-d07be4c3b1bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.750895 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:54:55 crc kubenswrapper[5047]: I0223 08:54:55.932018 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.010164 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d5d76bdc4-cpnp9"] Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.194326 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/horizon-f4589d677-twpgk" event={"ID":"d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df","Type":"ContainerDied","Data":"6aa9e689358c9b19cd85a38da3c1f29e1f925a61e4d34b825c91ab97f0bad7fe"} Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.194632 5047 scope.go:117] "RemoveContainer" containerID="0a1cf17da38d37e7e19cdcffb87509632fbe3fe36725bdb70121dd2da341db94" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.194515 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f4589d677-twpgk" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.197087 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d5d76bdc4-cpnp9" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon-log" containerID="cri-o://26017ece747974ffdb2e8ee6331e3bd6ff778aee7fd505cebbd28b7a52be3c0f" gracePeriod=30 Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.197407 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7779cb966c-fqwpp" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.199453 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d5d76bdc4-cpnp9" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon" containerID="cri-o://c17e0859db1cf49bd932cc5593a1692765c7b4cd7741539470b990a88f18badf" gracePeriod=30 Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.199475 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7779cb966c-fqwpp" event={"ID":"c6905234-3f71-4206-bd85-d07be4c3b1bb","Type":"ContainerDied","Data":"4d952746a30848a563a21e31f0501d07d0fb20000cdd06f4cdc9b5643d161292"} Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.231023 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f4589d677-twpgk"] Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.243746 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f4589d677-twpgk"] Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.251174 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7779cb966c-fqwpp"] Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.259626 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7779cb966c-fqwpp"] Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.341660 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.359458 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" path="/var/lib/kubelet/pods/c6905234-3f71-4206-bd85-d07be4c3b1bb/volumes" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.360450 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" 
path="/var/lib/kubelet/pods/d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df/volumes" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.403856 5047 scope.go:117] "RemoveContainer" containerID="3d09d52a458a6760ccf705d49c5dc5b61c3622bf94abcfefa00b55275e63a4aa" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.474232 5047 scope.go:117] "RemoveContainer" containerID="b98f20d2ec60c3534f8277cf33c22e615f730b22832c9fbf00f49936a2f9b091" Feb 23 08:54:56 crc kubenswrapper[5047]: I0223 08:54:56.737361 5047 scope.go:117] "RemoveContainer" containerID="0bdf234921f9ed5f40c39095e37eb4ae30b6cbbdc0bdd170ea8a987ef21440b8" Feb 23 08:54:57 crc kubenswrapper[5047]: I0223 08:54:57.213482 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"5ec7e4d435932fd8405e5cc018a986356ba6984287b8f0f9c351f6397165ec6c"} Feb 23 08:54:57 crc kubenswrapper[5047]: I0223 08:54:57.897572 5047 scope.go:117] "RemoveContainer" containerID="b51e998165c084679a97f5c7335ff4b3417b8dfb0bf00b6799909acce411d059" Feb 23 08:54:57 crc kubenswrapper[5047]: I0223 08:54:57.950950 5047 scope.go:117] "RemoveContainer" containerID="0c4ae39a993f67b6c8a66c49094ec07dcf60eaafc3fdcbdb2029f7d11ffa0950" Feb 23 08:54:58 crc kubenswrapper[5047]: I0223 08:54:58.010999 5047 scope.go:117] "RemoveContainer" containerID="4aa359bcaac449785a7d3d3df0a1dbebab24968a3cefad0445fde787bfdb57e3" Feb 23 08:54:58 crc kubenswrapper[5047]: I0223 08:54:58.043050 5047 scope.go:117] "RemoveContainer" containerID="924b774bf422115ee2d85401ee8918ca65c2669a4a178bdb31127a433d44f634" Feb 23 08:54:58 crc kubenswrapper[5047]: I0223 08:54:58.079704 5047 scope.go:117] "RemoveContainer" containerID="9177b4b644eb8eea46c3856893d8c1c6bff7c11a802fa3f7fe8f88a20ff6d1f7" Feb 23 08:55:00 crc kubenswrapper[5047]: I0223 08:55:00.266295 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerID="c17e0859db1cf49bd932cc5593a1692765c7b4cd7741539470b990a88f18badf" exitCode=0 Feb 23 08:55:00 crc kubenswrapper[5047]: I0223 08:55:00.266383 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d76bdc4-cpnp9" event={"ID":"ddbc9d78-4505-4215-b28c-69af3df4de72","Type":"ContainerDied","Data":"c17e0859db1cf49bd932cc5593a1692765c7b4cd7741539470b990a88f18badf"} Feb 23 08:55:02 crc kubenswrapper[5047]: I0223 08:55:02.198730 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d5d76bdc4-cpnp9" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.112:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8443: connect: connection refused" Feb 23 08:55:04 crc kubenswrapper[5047]: I0223 08:55:04.069081 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-kdn4n"] Feb 23 08:55:04 crc kubenswrapper[5047]: I0223 08:55:04.078957 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-kdn4n"] Feb 23 08:55:04 crc kubenswrapper[5047]: I0223 08:55:04.352073 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c60b58d-c0c1-470c-b84d-79528f882fb6" path="/var/lib/kubelet/pods/0c60b58d-c0c1-470c-b84d-79528f882fb6/volumes" Feb 23 08:55:12 crc kubenswrapper[5047]: I0223 08:55:12.198852 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d5d76bdc4-cpnp9" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.112:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8443: connect: connection refused" Feb 23 08:55:22 crc kubenswrapper[5047]: I0223 08:55:22.199446 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d5d76bdc4-cpnp9" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.112:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8443: connect: connection refused" Feb 23 08:55:22 crc kubenswrapper[5047]: I0223 08:55:22.200628 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.538207 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcdmb"] Feb 23 08:55:24 crc kubenswrapper[5047]: E0223 08:55:24.539176 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539200 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon" Feb 23 08:55:24 crc kubenswrapper[5047]: E0223 08:55:24.539223 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539232 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon" Feb 23 08:55:24 crc kubenswrapper[5047]: E0223 08:55:24.539255 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon-log" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539264 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon-log" Feb 23 08:55:24 crc kubenswrapper[5047]: E0223 08:55:24.539306 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon-log" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539315 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon-log" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539581 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon-log" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539613 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon-log" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539634 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6905234-3f71-4206-bd85-d07be4c3b1bb" containerName="horizon" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.539668 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c87f59-46f9-41a5-9d24-3a3c6d0cc5df" containerName="horizon" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.541472 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.562064 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcdmb"] Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.711781 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-utilities\") pod \"community-operators-fcdmb\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.712286 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv547\" (UniqueName: \"kubernetes.io/projected/ac1a76f7-dee9-4bbf-adb7-87256896eff8-kube-api-access-jv547\") pod \"community-operators-fcdmb\" (UID: 
\"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.712419 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-catalog-content\") pod \"community-operators-fcdmb\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.815233 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv547\" (UniqueName: \"kubernetes.io/projected/ac1a76f7-dee9-4bbf-adb7-87256896eff8-kube-api-access-jv547\") pod \"community-operators-fcdmb\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.815864 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-catalog-content\") pod \"community-operators-fcdmb\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.816241 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-utilities\") pod \"community-operators-fcdmb\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.816459 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-catalog-content\") pod \"community-operators-fcdmb\" (UID: 
\"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.816589 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-utilities\") pod \"community-operators-fcdmb\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.845683 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv547\" (UniqueName: \"kubernetes.io/projected/ac1a76f7-dee9-4bbf-adb7-87256896eff8-kube-api-access-jv547\") pod \"community-operators-fcdmb\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:24 crc kubenswrapper[5047]: I0223 08:55:24.925653 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:25 crc kubenswrapper[5047]: I0223 08:55:25.440050 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcdmb"] Feb 23 08:55:25 crc kubenswrapper[5047]: I0223 08:55:25.567498 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdmb" event={"ID":"ac1a76f7-dee9-4bbf-adb7-87256896eff8","Type":"ContainerStarted","Data":"2aa6bc483ed7f9535ce2d0d9637c106b8975ecbbfc348f77015a86aa3c73f322"} Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.581755 5047 generic.go:334] "Generic (PLEG): container finished" podID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerID="6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1" exitCode=0 Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.582036 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdmb" 
event={"ID":"ac1a76f7-dee9-4bbf-adb7-87256896eff8","Type":"ContainerDied","Data":"6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1"} Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.593983 5047 generic.go:334] "Generic (PLEG): container finished" podID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerID="26017ece747974ffdb2e8ee6331e3bd6ff778aee7fd505cebbd28b7a52be3c0f" exitCode=137 Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.594083 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d76bdc4-cpnp9" event={"ID":"ddbc9d78-4505-4215-b28c-69af3df4de72","Type":"ContainerDied","Data":"26017ece747974ffdb2e8ee6331e3bd6ff778aee7fd505cebbd28b7a52be3c0f"} Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.708849 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.874480 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-secret-key\") pod \"ddbc9d78-4505-4215-b28c-69af3df4de72\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.874563 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-combined-ca-bundle\") pod \"ddbc9d78-4505-4215-b28c-69af3df4de72\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.874624 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddbc9d78-4505-4215-b28c-69af3df4de72-logs\") pod \"ddbc9d78-4505-4215-b28c-69af3df4de72\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 
08:55:26.874730 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-tls-certs\") pod \"ddbc9d78-4505-4215-b28c-69af3df4de72\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.874777 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5jx4\" (UniqueName: \"kubernetes.io/projected/ddbc9d78-4505-4215-b28c-69af3df4de72-kube-api-access-m5jx4\") pod \"ddbc9d78-4505-4215-b28c-69af3df4de72\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.875460 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddbc9d78-4505-4215-b28c-69af3df4de72-logs" (OuterVolumeSpecName: "logs") pod "ddbc9d78-4505-4215-b28c-69af3df4de72" (UID: "ddbc9d78-4505-4215-b28c-69af3df4de72"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.875794 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-config-data\") pod \"ddbc9d78-4505-4215-b28c-69af3df4de72\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.875942 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-scripts\") pod \"ddbc9d78-4505-4215-b28c-69af3df4de72\" (UID: \"ddbc9d78-4505-4215-b28c-69af3df4de72\") " Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.876475 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddbc9d78-4505-4215-b28c-69af3df4de72-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.880938 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddbc9d78-4505-4215-b28c-69af3df4de72-kube-api-access-m5jx4" (OuterVolumeSpecName: "kube-api-access-m5jx4") pod "ddbc9d78-4505-4215-b28c-69af3df4de72" (UID: "ddbc9d78-4505-4215-b28c-69af3df4de72"). InnerVolumeSpecName "kube-api-access-m5jx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.884005 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ddbc9d78-4505-4215-b28c-69af3df4de72" (UID: "ddbc9d78-4505-4215-b28c-69af3df4de72"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.921130 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddbc9d78-4505-4215-b28c-69af3df4de72" (UID: "ddbc9d78-4505-4215-b28c-69af3df4de72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.929392 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-scripts" (OuterVolumeSpecName: "scripts") pod "ddbc9d78-4505-4215-b28c-69af3df4de72" (UID: "ddbc9d78-4505-4215-b28c-69af3df4de72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.935308 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ddbc9d78-4505-4215-b28c-69af3df4de72" (UID: "ddbc9d78-4505-4215-b28c-69af3df4de72"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.938203 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zb25t"] Feb 23 08:55:26 crc kubenswrapper[5047]: E0223 08:55:26.938745 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.938769 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon" Feb 23 08:55:26 crc kubenswrapper[5047]: E0223 08:55:26.938788 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon-log" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.938796 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon-log" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.939102 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon-log" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.939131 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" containerName="horizon" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.940861 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.947004 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zb25t"] Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.951539 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-config-data" (OuterVolumeSpecName: "config-data") pod "ddbc9d78-4505-4215-b28c-69af3df4de72" (UID: "ddbc9d78-4505-4215-b28c-69af3df4de72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978081 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftxcl\" (UniqueName: \"kubernetes.io/projected/3052f6fb-2684-49e5-a796-87f35378dae1-kube-api-access-ftxcl\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978201 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-utilities\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978247 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-catalog-content\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978360 
5047 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978378 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978388 5047 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddbc9d78-4505-4215-b28c-69af3df4de72-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978399 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5jx4\" (UniqueName: \"kubernetes.io/projected/ddbc9d78-4505-4215-b28c-69af3df4de72-kube-api-access-m5jx4\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978413 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:26 crc kubenswrapper[5047]: I0223 08:55:26.978424 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddbc9d78-4505-4215-b28c-69af3df4de72-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.079299 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-utilities\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.079375 
5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-catalog-content\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.079484 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftxcl\" (UniqueName: \"kubernetes.io/projected/3052f6fb-2684-49e5-a796-87f35378dae1-kube-api-access-ftxcl\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.079892 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-utilities\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.080290 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-catalog-content\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.098795 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftxcl\" (UniqueName: \"kubernetes.io/projected/3052f6fb-2684-49e5-a796-87f35378dae1-kube-api-access-ftxcl\") pod \"certified-operators-zb25t\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.330761 5047 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.658503 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5d76bdc4-cpnp9" event={"ID":"ddbc9d78-4505-4215-b28c-69af3df4de72","Type":"ContainerDied","Data":"d3989c0690715316910da09d2405d581a7394fdfe3fe7e45d50ed75c4b58e087"} Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.658561 5047 scope.go:117] "RemoveContainer" containerID="c17e0859db1cf49bd932cc5593a1692765c7b4cd7741539470b990a88f18badf" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.658718 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5d76bdc4-cpnp9" Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.666390 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdmb" event={"ID":"ac1a76f7-dee9-4bbf-adb7-87256896eff8","Type":"ContainerStarted","Data":"10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0"} Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.684480 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zb25t"] Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.770821 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d5d76bdc4-cpnp9"] Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.779450 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d5d76bdc4-cpnp9"] Feb 23 08:55:27 crc kubenswrapper[5047]: I0223 08:55:27.851830 5047 scope.go:117] "RemoveContainer" containerID="26017ece747974ffdb2e8ee6331e3bd6ff778aee7fd505cebbd28b7a52be3c0f" Feb 23 08:55:28 crc kubenswrapper[5047]: I0223 08:55:28.354624 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddbc9d78-4505-4215-b28c-69af3df4de72" 
path="/var/lib/kubelet/pods/ddbc9d78-4505-4215-b28c-69af3df4de72/volumes" Feb 23 08:55:28 crc kubenswrapper[5047]: I0223 08:55:28.683863 5047 generic.go:334] "Generic (PLEG): container finished" podID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerID="10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0" exitCode=0 Feb 23 08:55:28 crc kubenswrapper[5047]: I0223 08:55:28.684019 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdmb" event={"ID":"ac1a76f7-dee9-4bbf-adb7-87256896eff8","Type":"ContainerDied","Data":"10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0"} Feb 23 08:55:28 crc kubenswrapper[5047]: I0223 08:55:28.687497 5047 generic.go:334] "Generic (PLEG): container finished" podID="3052f6fb-2684-49e5-a796-87f35378dae1" containerID="4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3" exitCode=0 Feb 23 08:55:28 crc kubenswrapper[5047]: I0223 08:55:28.687580 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb25t" event={"ID":"3052f6fb-2684-49e5-a796-87f35378dae1","Type":"ContainerDied","Data":"4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3"} Feb 23 08:55:28 crc kubenswrapper[5047]: I0223 08:55:28.687631 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb25t" event={"ID":"3052f6fb-2684-49e5-a796-87f35378dae1","Type":"ContainerStarted","Data":"af206306f1acd6d72a74df095d6cd3cbd2f5dc489eb2612f1daf5fd3053acb07"} Feb 23 08:55:29 crc kubenswrapper[5047]: I0223 08:55:29.702848 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdmb" event={"ID":"ac1a76f7-dee9-4bbf-adb7-87256896eff8","Type":"ContainerStarted","Data":"2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6"} Feb 23 08:55:29 crc kubenswrapper[5047]: I0223 08:55:29.746720 5047 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-fcdmb" podStartSLOduration=3.233687756 podStartE2EDuration="5.746689756s" podCreationTimestamp="2026-02-23 08:55:24 +0000 UTC" firstStartedPulling="2026-02-23 08:55:26.586471293 +0000 UTC m=+7848.837798427" lastFinishedPulling="2026-02-23 08:55:29.099473253 +0000 UTC m=+7851.350800427" observedRunningTime="2026-02-23 08:55:29.734307533 +0000 UTC m=+7851.985634677" watchObservedRunningTime="2026-02-23 08:55:29.746689756 +0000 UTC m=+7851.998016930" Feb 23 08:55:30 crc kubenswrapper[5047]: I0223 08:55:30.716117 5047 generic.go:334] "Generic (PLEG): container finished" podID="3052f6fb-2684-49e5-a796-87f35378dae1" containerID="aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e" exitCode=0 Feb 23 08:55:30 crc kubenswrapper[5047]: I0223 08:55:30.716210 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb25t" event={"ID":"3052f6fb-2684-49e5-a796-87f35378dae1","Type":"ContainerDied","Data":"aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e"} Feb 23 08:55:31 crc kubenswrapper[5047]: I0223 08:55:31.072865 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-246gf"] Feb 23 08:55:31 crc kubenswrapper[5047]: I0223 08:55:31.083321 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-317c-account-create-update-88hlp"] Feb 23 08:55:31 crc kubenswrapper[5047]: I0223 08:55:31.092639 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-246gf"] Feb 23 08:55:31 crc kubenswrapper[5047]: I0223 08:55:31.102476 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-317c-account-create-update-88hlp"] Feb 23 08:55:31 crc kubenswrapper[5047]: I0223 08:55:31.727257 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb25t" 
event={"ID":"3052f6fb-2684-49e5-a796-87f35378dae1","Type":"ContainerStarted","Data":"d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7"} Feb 23 08:55:31 crc kubenswrapper[5047]: I0223 08:55:31.751996 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zb25t" podStartSLOduration=3.233214182 podStartE2EDuration="5.751973985s" podCreationTimestamp="2026-02-23 08:55:26 +0000 UTC" firstStartedPulling="2026-02-23 08:55:28.691016958 +0000 UTC m=+7850.942344112" lastFinishedPulling="2026-02-23 08:55:31.209776781 +0000 UTC m=+7853.461103915" observedRunningTime="2026-02-23 08:55:31.748079771 +0000 UTC m=+7853.999406905" watchObservedRunningTime="2026-02-23 08:55:31.751973985 +0000 UTC m=+7854.003301119" Feb 23 08:55:32 crc kubenswrapper[5047]: I0223 08:55:32.352400 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f0974c-f916-44fb-92ff-4e3c7dafadd1" path="/var/lib/kubelet/pods/24f0974c-f916-44fb-92ff-4e3c7dafadd1/volumes" Feb 23 08:55:32 crc kubenswrapper[5047]: I0223 08:55:32.353633 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb" path="/var/lib/kubelet/pods/e3bb78a6-36a8-41d7-bc0f-67c24db2fcfb/volumes" Feb 23 08:55:34 crc kubenswrapper[5047]: I0223 08:55:34.926739 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:34 crc kubenswrapper[5047]: I0223 08:55:34.927354 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:35 crc kubenswrapper[5047]: I0223 08:55:35.987669 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fcdmb" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="registry-server" probeResult="failure" output=< Feb 23 08:55:35 crc kubenswrapper[5047]: timeout: 
failed to connect service ":50051" within 1s Feb 23 08:55:35 crc kubenswrapper[5047]: > Feb 23 08:55:37 crc kubenswrapper[5047]: I0223 08:55:37.331550 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:37 crc kubenswrapper[5047]: I0223 08:55:37.332151 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:37 crc kubenswrapper[5047]: I0223 08:55:37.410350 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:37 crc kubenswrapper[5047]: I0223 08:55:37.859376 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:37 crc kubenswrapper[5047]: I0223 08:55:37.939189 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zb25t"] Feb 23 08:55:39 crc kubenswrapper[5047]: I0223 08:55:39.828322 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zb25t" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="registry-server" containerID="cri-o://d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7" gracePeriod=2 Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.034965 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-w76mz"] Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.054698 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-w76mz"] Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.319242 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.365729 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77e4cfd-baf4-4d7f-bc0e-a28608a27638" path="/var/lib/kubelet/pods/d77e4cfd-baf4-4d7f-bc0e-a28608a27638/volumes" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.500218 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-catalog-content\") pod \"3052f6fb-2684-49e5-a796-87f35378dae1\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.500322 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-utilities\") pod \"3052f6fb-2684-49e5-a796-87f35378dae1\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.500350 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftxcl\" (UniqueName: \"kubernetes.io/projected/3052f6fb-2684-49e5-a796-87f35378dae1-kube-api-access-ftxcl\") pod \"3052f6fb-2684-49e5-a796-87f35378dae1\" (UID: \"3052f6fb-2684-49e5-a796-87f35378dae1\") " Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.502682 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-utilities" (OuterVolumeSpecName: "utilities") pod "3052f6fb-2684-49e5-a796-87f35378dae1" (UID: "3052f6fb-2684-49e5-a796-87f35378dae1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.503440 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.508216 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3052f6fb-2684-49e5-a796-87f35378dae1-kube-api-access-ftxcl" (OuterVolumeSpecName: "kube-api-access-ftxcl") pod "3052f6fb-2684-49e5-a796-87f35378dae1" (UID: "3052f6fb-2684-49e5-a796-87f35378dae1"). InnerVolumeSpecName "kube-api-access-ftxcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.547428 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3052f6fb-2684-49e5-a796-87f35378dae1" (UID: "3052f6fb-2684-49e5-a796-87f35378dae1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.605723 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftxcl\" (UniqueName: \"kubernetes.io/projected/3052f6fb-2684-49e5-a796-87f35378dae1-kube-api-access-ftxcl\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.605765 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3052f6fb-2684-49e5-a796-87f35378dae1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.842317 5047 generic.go:334] "Generic (PLEG): container finished" podID="3052f6fb-2684-49e5-a796-87f35378dae1" containerID="d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7" exitCode=0 Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.842379 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb25t" event={"ID":"3052f6fb-2684-49e5-a796-87f35378dae1","Type":"ContainerDied","Data":"d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7"} Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.842419 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb25t" event={"ID":"3052f6fb-2684-49e5-a796-87f35378dae1","Type":"ContainerDied","Data":"af206306f1acd6d72a74df095d6cd3cbd2f5dc489eb2612f1daf5fd3053acb07"} Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.842449 5047 scope.go:117] "RemoveContainer" containerID="d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.842651 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zb25t" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.866647 5047 scope.go:117] "RemoveContainer" containerID="aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.894002 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zb25t"] Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.912979 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zb25t"] Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.913398 5047 scope.go:117] "RemoveContainer" containerID="4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.943710 5047 scope.go:117] "RemoveContainer" containerID="d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7" Feb 23 08:55:40 crc kubenswrapper[5047]: E0223 08:55:40.944196 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7\": container with ID starting with d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7 not found: ID does not exist" containerID="d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.944237 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7"} err="failed to get container status \"d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7\": rpc error: code = NotFound desc = could not find container \"d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7\": container with ID starting with d5cb26ed76a8fa2bd66c41fa2028f33446638198e146f6a485d30679921372d7 not 
found: ID does not exist" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.944262 5047 scope.go:117] "RemoveContainer" containerID="aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e" Feb 23 08:55:40 crc kubenswrapper[5047]: E0223 08:55:40.944658 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e\": container with ID starting with aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e not found: ID does not exist" containerID="aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.944715 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e"} err="failed to get container status \"aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e\": rpc error: code = NotFound desc = could not find container \"aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e\": container with ID starting with aea40276c3bd66b32f4e38bddd38d4d16f4e8256f0c99016ad4b2c207eddf13e not found: ID does not exist" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.944760 5047 scope.go:117] "RemoveContainer" containerID="4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3" Feb 23 08:55:40 crc kubenswrapper[5047]: E0223 08:55:40.945207 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3\": container with ID starting with 4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3 not found: ID does not exist" containerID="4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3" Feb 23 08:55:40 crc kubenswrapper[5047]: I0223 08:55:40.945266 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3"} err="failed to get container status \"4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3\": rpc error: code = NotFound desc = could not find container \"4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3\": container with ID starting with 4cdd8d6656749ced66937727690058d0a77e8862e46071832fd5b441cd9ab2e3 not found: ID does not exist" Feb 23 08:55:42 crc kubenswrapper[5047]: I0223 08:55:42.368978 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" path="/var/lib/kubelet/pods/3052f6fb-2684-49e5-a796-87f35378dae1/volumes" Feb 23 08:55:45 crc kubenswrapper[5047]: I0223 08:55:45.059802 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:45 crc kubenswrapper[5047]: I0223 08:55:45.149038 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:45 crc kubenswrapper[5047]: I0223 08:55:45.321573 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcdmb"] Feb 23 08:55:46 crc kubenswrapper[5047]: I0223 08:55:46.918569 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fcdmb" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="registry-server" containerID="cri-o://2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6" gracePeriod=2 Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.578083 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.678435 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-utilities\") pod \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.678500 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-catalog-content\") pod \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.678632 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv547\" (UniqueName: \"kubernetes.io/projected/ac1a76f7-dee9-4bbf-adb7-87256896eff8-kube-api-access-jv547\") pod \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\" (UID: \"ac1a76f7-dee9-4bbf-adb7-87256896eff8\") " Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.680216 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-utilities" (OuterVolumeSpecName: "utilities") pod "ac1a76f7-dee9-4bbf-adb7-87256896eff8" (UID: "ac1a76f7-dee9-4bbf-adb7-87256896eff8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.699944 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1a76f7-dee9-4bbf-adb7-87256896eff8-kube-api-access-jv547" (OuterVolumeSpecName: "kube-api-access-jv547") pod "ac1a76f7-dee9-4bbf-adb7-87256896eff8" (UID: "ac1a76f7-dee9-4bbf-adb7-87256896eff8"). InnerVolumeSpecName "kube-api-access-jv547". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.733557 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac1a76f7-dee9-4bbf-adb7-87256896eff8" (UID: "ac1a76f7-dee9-4bbf-adb7-87256896eff8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.781600 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.781635 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv547\" (UniqueName: \"kubernetes.io/projected/ac1a76f7-dee9-4bbf-adb7-87256896eff8-kube-api-access-jv547\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.781647 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1a76f7-dee9-4bbf-adb7-87256896eff8-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.931768 5047 generic.go:334] "Generic (PLEG): container finished" podID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerID="2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6" exitCode=0 Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.931844 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcdmb" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.931872 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdmb" event={"ID":"ac1a76f7-dee9-4bbf-adb7-87256896eff8","Type":"ContainerDied","Data":"2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6"} Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.933925 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdmb" event={"ID":"ac1a76f7-dee9-4bbf-adb7-87256896eff8","Type":"ContainerDied","Data":"2aa6bc483ed7f9535ce2d0d9637c106b8975ecbbfc348f77015a86aa3c73f322"} Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.933972 5047 scope.go:117] "RemoveContainer" containerID="2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6" Feb 23 08:55:47 crc kubenswrapper[5047]: I0223 08:55:47.975785 5047 scope.go:117] "RemoveContainer" containerID="10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.001282 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcdmb"] Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.015622 5047 scope.go:117] "RemoveContainer" containerID="6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.016254 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fcdmb"] Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.087262 5047 scope.go:117] "RemoveContainer" containerID="2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6" Feb 23 08:55:48 crc kubenswrapper[5047]: E0223 08:55:48.088079 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6\": container with ID starting with 2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6 not found: ID does not exist" containerID="2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.088142 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6"} err="failed to get container status \"2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6\": rpc error: code = NotFound desc = could not find container \"2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6\": container with ID starting with 2204fe3da296b1746216190ae737175be5bef5cae63bb6af85922c994b48ffa6 not found: ID does not exist" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.088184 5047 scope.go:117] "RemoveContainer" containerID="10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0" Feb 23 08:55:48 crc kubenswrapper[5047]: E0223 08:55:48.088745 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0\": container with ID starting with 10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0 not found: ID does not exist" containerID="10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.088819 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0"} err="failed to get container status \"10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0\": rpc error: code = NotFound desc = could not find container \"10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0\": container with ID 
starting with 10e28b45b927b8aa4909cbc4fa906d9332146f251891453869b0a2ab02eda0e0 not found: ID does not exist" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.088859 5047 scope.go:117] "RemoveContainer" containerID="6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1" Feb 23 08:55:48 crc kubenswrapper[5047]: E0223 08:55:48.089319 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1\": container with ID starting with 6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1 not found: ID does not exist" containerID="6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.089362 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1"} err="failed to get container status \"6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1\": rpc error: code = NotFound desc = could not find container \"6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1\": container with ID starting with 6457de0e7ecee411c1f7868461469b4fed9fca0eaafcf093f4cbb89242c67af1 not found: ID does not exist" Feb 23 08:55:48 crc kubenswrapper[5047]: I0223 08:55:48.354400 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" path="/var/lib/kubelet/pods/ac1a76f7-dee9-4bbf-adb7-87256896eff8/volumes" Feb 23 08:55:58 crc kubenswrapper[5047]: I0223 08:55:58.288330 5047 scope.go:117] "RemoveContainer" containerID="e7f35f7cdd514ebfa9ff8bac6c35833daf8e0cbdaf1da0233acf88329b6c931d" Feb 23 08:55:58 crc kubenswrapper[5047]: I0223 08:55:58.322750 5047 scope.go:117] "RemoveContainer" containerID="1221a96ef2e1d143401f421d9dbd9acc36d207a6a2d1b24a62529ccca05df63f" Feb 23 08:55:58 crc kubenswrapper[5047]: 
I0223 08:55:58.413786 5047 scope.go:117] "RemoveContainer" containerID="640785e4e41685161f7d1ad957f6b772163bce0d1962206bf9d40200db078131" Feb 23 08:55:58 crc kubenswrapper[5047]: I0223 08:55:58.450166 5047 scope.go:117] "RemoveContainer" containerID="aa4b9f2c9aa9e2fe310ec55c2272fdcdb5d561e7689f4ce33ddea1abbb0a12c3" Feb 23 08:55:58 crc kubenswrapper[5047]: I0223 08:55:58.474526 5047 scope.go:117] "RemoveContainer" containerID="a6186d215ad7a57d35d2f665f282fc8f3c5f56b3ee5f6bf2f1bdca7762d547e9" Feb 23 08:55:58 crc kubenswrapper[5047]: I0223 08:55:58.510314 5047 scope.go:117] "RemoveContainer" containerID="25a65cae038fba539bc27589cc7c4f44444fed5df321cb4a5f40a5de73068b1e" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.088844 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86dd65c656-6n5cp"] Feb 23 08:56:02 crc kubenswrapper[5047]: E0223 08:56:02.089758 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="extract-utilities" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.089782 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="extract-utilities" Feb 23 08:56:02 crc kubenswrapper[5047]: E0223 08:56:02.089816 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="extract-utilities" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.089828 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="extract-utilities" Feb 23 08:56:02 crc kubenswrapper[5047]: E0223 08:56:02.089868 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="extract-content" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.089882 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" 
containerName="extract-content" Feb 23 08:56:02 crc kubenswrapper[5047]: E0223 08:56:02.089932 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="extract-content" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.089946 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="extract-content" Feb 23 08:56:02 crc kubenswrapper[5047]: E0223 08:56:02.089973 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="registry-server" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.089982 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="registry-server" Feb 23 08:56:02 crc kubenswrapper[5047]: E0223 08:56:02.089998 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="registry-server" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.090009 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="registry-server" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.090277 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3052f6fb-2684-49e5-a796-87f35378dae1" containerName="registry-server" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.090314 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1a76f7-dee9-4bbf-adb7-87256896eff8" containerName="registry-server" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.091751 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.118403 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86dd65c656-6n5cp"] Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.229533 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adff4079-41bc-4cde-bc30-4a29f5302568-logs\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.229609 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-secret-key\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.229814 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-scripts\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.229975 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-config-data\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.230025 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-tls-certs\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.230059 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-combined-ca-bundle\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.230164 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq7ln\" (UniqueName: \"kubernetes.io/projected/adff4079-41bc-4cde-bc30-4a29f5302568-kube-api-access-gq7ln\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.332202 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-config-data\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.332271 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-tls-certs\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.332320 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-combined-ca-bundle\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.332383 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq7ln\" (UniqueName: \"kubernetes.io/projected/adff4079-41bc-4cde-bc30-4a29f5302568-kube-api-access-gq7ln\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.332426 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adff4079-41bc-4cde-bc30-4a29f5302568-logs\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.332474 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-secret-key\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.332533 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-scripts\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.333338 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adff4079-41bc-4cde-bc30-4a29f5302568-logs\") pod \"horizon-86dd65c656-6n5cp\" (UID: 
\"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.333605 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-scripts\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.334163 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-config-data\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.339466 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-secret-key\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.355980 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq7ln\" (UniqueName: \"kubernetes.io/projected/adff4079-41bc-4cde-bc30-4a29f5302568-kube-api-access-gq7ln\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.360535 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-tls-certs\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 
08:56:02.361280 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-combined-ca-bundle\") pod \"horizon-86dd65c656-6n5cp\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.412926 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:02 crc kubenswrapper[5047]: I0223 08:56:02.933718 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86dd65c656-6n5cp"] Feb 23 08:56:02 crc kubenswrapper[5047]: W0223 08:56:02.937305 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadff4079_41bc_4cde_bc30_4a29f5302568.slice/crio-3d866e70ebba36c008403cf1441ec6a5555de1d40558d732a15fa81a0bb7be51 WatchSource:0}: Error finding container 3d866e70ebba36c008403cf1441ec6a5555de1d40558d732a15fa81a0bb7be51: Status 404 returned error can't find the container with id 3d866e70ebba36c008403cf1441ec6a5555de1d40558d732a15fa81a0bb7be51 Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.361603 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dd65c656-6n5cp" event={"ID":"adff4079-41bc-4cde-bc30-4a29f5302568","Type":"ContainerStarted","Data":"902673b5c0f9f5c778122d89416d764f9c657be00747f6d0aaf5e3bce9ebde6d"} Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.361864 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dd65c656-6n5cp" event={"ID":"adff4079-41bc-4cde-bc30-4a29f5302568","Type":"ContainerStarted","Data":"3d866e70ebba36c008403cf1441ec6a5555de1d40558d732a15fa81a0bb7be51"} Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.495530 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-jppvl"] Feb 23 
08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.497431 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.509988 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-jppvl"] Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.576268 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a32b6fb-abc2-42ed-94ba-0682376fdc51-operator-scripts\") pod \"heat-db-create-jppvl\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.576373 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxv9\" (UniqueName: \"kubernetes.io/projected/0a32b6fb-abc2-42ed-94ba-0682376fdc51-kube-api-access-nnxv9\") pod \"heat-db-create-jppvl\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.595934 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-d754-account-create-update-78cx7"] Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.597307 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.599750 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.604819 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d754-account-create-update-78cx7"] Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.678487 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5stb4\" (UniqueName: \"kubernetes.io/projected/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-kube-api-access-5stb4\") pod \"heat-d754-account-create-update-78cx7\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.678540 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a32b6fb-abc2-42ed-94ba-0682376fdc51-operator-scripts\") pod \"heat-db-create-jppvl\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.678637 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxv9\" (UniqueName: \"kubernetes.io/projected/0a32b6fb-abc2-42ed-94ba-0682376fdc51-kube-api-access-nnxv9\") pod \"heat-db-create-jppvl\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.678694 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-operator-scripts\") pod \"heat-d754-account-create-update-78cx7\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " 
pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.679706 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a32b6fb-abc2-42ed-94ba-0682376fdc51-operator-scripts\") pod \"heat-db-create-jppvl\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.696653 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxv9\" (UniqueName: \"kubernetes.io/projected/0a32b6fb-abc2-42ed-94ba-0682376fdc51-kube-api-access-nnxv9\") pod \"heat-db-create-jppvl\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.781197 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5stb4\" (UniqueName: \"kubernetes.io/projected/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-kube-api-access-5stb4\") pod \"heat-d754-account-create-update-78cx7\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.781385 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-operator-scripts\") pod \"heat-d754-account-create-update-78cx7\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.782211 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-operator-scripts\") pod \"heat-d754-account-create-update-78cx7\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " 
pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.800015 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5stb4\" (UniqueName: \"kubernetes.io/projected/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-kube-api-access-5stb4\") pod \"heat-d754-account-create-update-78cx7\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.823848 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jppvl" Feb 23 08:56:03 crc kubenswrapper[5047]: I0223 08:56:03.957289 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:04 crc kubenswrapper[5047]: I0223 08:56:04.304968 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-jppvl"] Feb 23 08:56:04 crc kubenswrapper[5047]: I0223 08:56:04.407289 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dd65c656-6n5cp" event={"ID":"adff4079-41bc-4cde-bc30-4a29f5302568","Type":"ContainerStarted","Data":"47ff5d6db1f45daec44c8d4bbc86c28325f137beacfc6286b66cd85eba48a741"} Feb 23 08:56:04 crc kubenswrapper[5047]: I0223 08:56:04.415426 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jppvl" event={"ID":"0a32b6fb-abc2-42ed-94ba-0682376fdc51","Type":"ContainerStarted","Data":"223e5067aa8b67a5447f52ef7b3476780db304785cb18246174b71a5e84a93ef"} Feb 23 08:56:04 crc kubenswrapper[5047]: I0223 08:56:04.449497 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d754-account-create-update-78cx7"] Feb 23 08:56:04 crc kubenswrapper[5047]: I0223 08:56:04.452616 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86dd65c656-6n5cp" podStartSLOduration=2.452590477 
podStartE2EDuration="2.452590477s" podCreationTimestamp="2026-02-23 08:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:56:04.440579865 +0000 UTC m=+7886.691906999" watchObservedRunningTime="2026-02-23 08:56:04.452590477 +0000 UTC m=+7886.703917611" Feb 23 08:56:05 crc kubenswrapper[5047]: E0223 08:56:05.004199 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5a5bd3_90cf_4e07_ae1c_53f13ee3a553.slice/crio-conmon-9f6508e2706dcc1d9229284a5dbaed31e788597ffade321a44957814cb190cdb.scope\": RecentStats: unable to find data in memory cache]" Feb 23 08:56:05 crc kubenswrapper[5047]: I0223 08:56:05.428570 5047 generic.go:334] "Generic (PLEG): container finished" podID="0a32b6fb-abc2-42ed-94ba-0682376fdc51" containerID="b6d42e4e1c915ab7ad50cbcf97e861f1ad954fa7b65115fca83232f8d0cd78e5" exitCode=0 Feb 23 08:56:05 crc kubenswrapper[5047]: I0223 08:56:05.428694 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jppvl" event={"ID":"0a32b6fb-abc2-42ed-94ba-0682376fdc51","Type":"ContainerDied","Data":"b6d42e4e1c915ab7ad50cbcf97e861f1ad954fa7b65115fca83232f8d0cd78e5"} Feb 23 08:56:05 crc kubenswrapper[5047]: I0223 08:56:05.431875 5047 generic.go:334] "Generic (PLEG): container finished" podID="cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553" containerID="9f6508e2706dcc1d9229284a5dbaed31e788597ffade321a44957814cb190cdb" exitCode=0 Feb 23 08:56:05 crc kubenswrapper[5047]: I0223 08:56:05.431939 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d754-account-create-update-78cx7" event={"ID":"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553","Type":"ContainerDied","Data":"9f6508e2706dcc1d9229284a5dbaed31e788597ffade321a44957814cb190cdb"} Feb 23 08:56:05 crc kubenswrapper[5047]: I0223 08:56:05.431988 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d754-account-create-update-78cx7" event={"ID":"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553","Type":"ContainerStarted","Data":"7939048d1b6bb1f42f09cff43d454bcd65f02ac94bad372d56a3e292ce037c2f"} Feb 23 08:56:06 crc kubenswrapper[5047]: I0223 08:56:06.926587 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jppvl" Feb 23 08:56:06 crc kubenswrapper[5047]: I0223 08:56:06.934970 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.074172 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5stb4\" (UniqueName: \"kubernetes.io/projected/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-kube-api-access-5stb4\") pod \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.074425 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-operator-scripts\") pod \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\" (UID: \"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553\") " Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.074489 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a32b6fb-abc2-42ed-94ba-0682376fdc51-operator-scripts\") pod \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.075208 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553" (UID: "cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.075433 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a32b6fb-abc2-42ed-94ba-0682376fdc51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a32b6fb-abc2-42ed-94ba-0682376fdc51" (UID: "0a32b6fb-abc2-42ed-94ba-0682376fdc51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.075616 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnxv9\" (UniqueName: \"kubernetes.io/projected/0a32b6fb-abc2-42ed-94ba-0682376fdc51-kube-api-access-nnxv9\") pod \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\" (UID: \"0a32b6fb-abc2-42ed-94ba-0682376fdc51\") " Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.077849 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.077952 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a32b6fb-abc2-42ed-94ba-0682376fdc51-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.081232 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-kube-api-access-5stb4" (OuterVolumeSpecName: "kube-api-access-5stb4") pod "cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553" (UID: "cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553"). InnerVolumeSpecName "kube-api-access-5stb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.091496 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a32b6fb-abc2-42ed-94ba-0682376fdc51-kube-api-access-nnxv9" (OuterVolumeSpecName: "kube-api-access-nnxv9") pod "0a32b6fb-abc2-42ed-94ba-0682376fdc51" (UID: "0a32b6fb-abc2-42ed-94ba-0682376fdc51"). InnerVolumeSpecName "kube-api-access-nnxv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.180804 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnxv9\" (UniqueName: \"kubernetes.io/projected/0a32b6fb-abc2-42ed-94ba-0682376fdc51-kube-api-access-nnxv9\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.180855 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5stb4\" (UniqueName: \"kubernetes.io/projected/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553-kube-api-access-5stb4\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.465618 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d754-account-create-update-78cx7" event={"ID":"cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553","Type":"ContainerDied","Data":"7939048d1b6bb1f42f09cff43d454bcd65f02ac94bad372d56a3e292ce037c2f"} Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.466009 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7939048d1b6bb1f42f09cff43d454bcd65f02ac94bad372d56a3e292ce037c2f" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.466103 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-d754-account-create-update-78cx7" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.470711 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jppvl" event={"ID":"0a32b6fb-abc2-42ed-94ba-0682376fdc51","Type":"ContainerDied","Data":"223e5067aa8b67a5447f52ef7b3476780db304785cb18246174b71a5e84a93ef"} Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.470749 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="223e5067aa8b67a5447f52ef7b3476780db304785cb18246174b71a5e84a93ef" Feb 23 08:56:07 crc kubenswrapper[5047]: I0223 08:56:07.470776 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jppvl" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.717638 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-jkd76"] Feb 23 08:56:08 crc kubenswrapper[5047]: E0223 08:56:08.718171 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553" containerName="mariadb-account-create-update" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.718189 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553" containerName="mariadb-account-create-update" Feb 23 08:56:08 crc kubenswrapper[5047]: E0223 08:56:08.718223 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a32b6fb-abc2-42ed-94ba-0682376fdc51" containerName="mariadb-database-create" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.718234 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a32b6fb-abc2-42ed-94ba-0682376fdc51" containerName="mariadb-database-create" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.718445 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553" containerName="mariadb-account-create-update" Feb 23 
08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.718474 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a32b6fb-abc2-42ed-94ba-0682376fdc51" containerName="mariadb-database-create" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.719306 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.722251 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.737087 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-jkd76"] Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.738771 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bt4ln" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.814929 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-combined-ca-bundle\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.815147 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-config-data\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.815359 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jftlf\" (UniqueName: \"kubernetes.io/projected/49355e96-b059-44c8-8a2e-f4f0b3486526-kube-api-access-jftlf\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") 
" pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.917147 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-combined-ca-bundle\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.917276 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-config-data\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.917353 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jftlf\" (UniqueName: \"kubernetes.io/projected/49355e96-b059-44c8-8a2e-f4f0b3486526-kube-api-access-jftlf\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.922932 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-combined-ca-bundle\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.923438 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-config-data\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:08 crc kubenswrapper[5047]: I0223 08:56:08.949193 5047 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-jftlf\" (UniqueName: \"kubernetes.io/projected/49355e96-b059-44c8-8a2e-f4f0b3486526-kube-api-access-jftlf\") pod \"heat-db-sync-jkd76\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:09 crc kubenswrapper[5047]: I0223 08:56:09.045105 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:09 crc kubenswrapper[5047]: I0223 08:56:09.496957 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-jkd76"] Feb 23 08:56:10 crc kubenswrapper[5047]: I0223 08:56:10.504653 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-jkd76" event={"ID":"49355e96-b059-44c8-8a2e-f4f0b3486526","Type":"ContainerStarted","Data":"d0277efecf4051985099a08106b3ef1687a105d0a9f9b4ce7da84300f3a4eba8"} Feb 23 08:56:12 crc kubenswrapper[5047]: I0223 08:56:12.414315 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:12 crc kubenswrapper[5047]: I0223 08:56:12.414634 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:19 crc kubenswrapper[5047]: I0223 08:56:19.601073 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-jkd76" event={"ID":"49355e96-b059-44c8-8a2e-f4f0b3486526","Type":"ContainerStarted","Data":"71886f23f03852059f3e2fb43eb1d7d43f524af30918a11409712a16fb14f4a7"} Feb 23 08:56:19 crc kubenswrapper[5047]: I0223 08:56:19.625621 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-jkd76" podStartSLOduration=2.211651452 podStartE2EDuration="11.625602539s" podCreationTimestamp="2026-02-23 08:56:08 +0000 UTC" firstStartedPulling="2026-02-23 08:56:09.508221269 +0000 UTC m=+7891.759548403" lastFinishedPulling="2026-02-23 08:56:18.922172356 +0000 UTC m=+7901.173499490" 
observedRunningTime="2026-02-23 08:56:19.621691473 +0000 UTC m=+7901.873018607" watchObservedRunningTime="2026-02-23 08:56:19.625602539 +0000 UTC m=+7901.876929673" Feb 23 08:56:21 crc kubenswrapper[5047]: I0223 08:56:21.625655 5047 generic.go:334] "Generic (PLEG): container finished" podID="49355e96-b059-44c8-8a2e-f4f0b3486526" containerID="71886f23f03852059f3e2fb43eb1d7d43f524af30918a11409712a16fb14f4a7" exitCode=0 Feb 23 08:56:21 crc kubenswrapper[5047]: I0223 08:56:21.626438 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-jkd76" event={"ID":"49355e96-b059-44c8-8a2e-f4f0b3486526","Type":"ContainerDied","Data":"71886f23f03852059f3e2fb43eb1d7d43f524af30918a11409712a16fb14f4a7"} Feb 23 08:56:22 crc kubenswrapper[5047]: I0223 08:56:22.968263 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.044802 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-combined-ca-bundle\") pod \"49355e96-b059-44c8-8a2e-f4f0b3486526\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.044947 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jftlf\" (UniqueName: \"kubernetes.io/projected/49355e96-b059-44c8-8a2e-f4f0b3486526-kube-api-access-jftlf\") pod \"49355e96-b059-44c8-8a2e-f4f0b3486526\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.044975 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-config-data\") pod \"49355e96-b059-44c8-8a2e-f4f0b3486526\" (UID: \"49355e96-b059-44c8-8a2e-f4f0b3486526\") " Feb 23 08:56:23 crc 
kubenswrapper[5047]: I0223 08:56:23.052471 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49355e96-b059-44c8-8a2e-f4f0b3486526-kube-api-access-jftlf" (OuterVolumeSpecName: "kube-api-access-jftlf") pod "49355e96-b059-44c8-8a2e-f4f0b3486526" (UID: "49355e96-b059-44c8-8a2e-f4f0b3486526"). InnerVolumeSpecName "kube-api-access-jftlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.081703 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49355e96-b059-44c8-8a2e-f4f0b3486526" (UID: "49355e96-b059-44c8-8a2e-f4f0b3486526"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.133179 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-config-data" (OuterVolumeSpecName: "config-data") pod "49355e96-b059-44c8-8a2e-f4f0b3486526" (UID: "49355e96-b059-44c8-8a2e-f4f0b3486526"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.147185 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jftlf\" (UniqueName: \"kubernetes.io/projected/49355e96-b059-44c8-8a2e-f4f0b3486526-kube-api-access-jftlf\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.147216 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.147226 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49355e96-b059-44c8-8a2e-f4f0b3486526-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.648129 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-jkd76" event={"ID":"49355e96-b059-44c8-8a2e-f4f0b3486526","Type":"ContainerDied","Data":"d0277efecf4051985099a08106b3ef1687a105d0a9f9b4ce7da84300f3a4eba8"} Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.648431 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0277efecf4051985099a08106b3ef1687a105d0a9f9b4ce7da84300f3a4eba8" Feb 23 08:56:23 crc kubenswrapper[5047]: I0223 08:56:23.648222 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-jkd76" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.727378 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.872118 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6f7f7b59bc-ft6jn"] Feb 23 08:56:24 crc kubenswrapper[5047]: E0223 08:56:24.872686 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49355e96-b059-44c8-8a2e-f4f0b3486526" containerName="heat-db-sync" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.872710 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="49355e96-b059-44c8-8a2e-f4f0b3486526" containerName="heat-db-sync" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.872939 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="49355e96-b059-44c8-8a2e-f4f0b3486526" containerName="heat-db-sync" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.873959 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.877714 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.877979 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bt4ln" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.878209 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.914106 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6f7f7b59bc-ft6jn"] Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.996865 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-combined-ca-bundle\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.996969 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:24 crc kubenswrapper[5047]: I0223 08:56:24.997063 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hj6\" (UniqueName: \"kubernetes.io/projected/b819f642-7df6-4fe0-81b1-531a509215d5-kube-api-access-f4hj6\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:24 crc 
kubenswrapper[5047]: I0223 08:56:24.997186 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data-custom\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.029932 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-79f65cbcb8-nvmfp"] Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.031156 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.035688 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.065670 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-79f65cbcb8-nvmfp"] Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.099145 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.099251 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hj6\" (UniqueName: \"kubernetes.io/projected/b819f642-7df6-4fe0-81b1-531a509215d5-kube-api-access-f4hj6\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.099322 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9pkf\" (UniqueName: \"kubernetes.io/projected/92c8be36-e854-4f95-b334-6871c09d4e70-kube-api-access-g9pkf\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.099342 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data-custom\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.099385 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-combined-ca-bundle\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.099406 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data-custom\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.099465 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-combined-ca-bundle\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 
08:56:25.099486 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.111371 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data-custom\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.112109 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-combined-ca-bundle\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.115314 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.123673 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-754f89c4b4-fntms"] Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.124926 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.133417 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hj6\" (UniqueName: \"kubernetes.io/projected/b819f642-7df6-4fe0-81b1-531a509215d5-kube-api-access-f4hj6\") pod \"heat-engine-6f7f7b59bc-ft6jn\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.133730 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.175363 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-754f89c4b4-fntms"] Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.202400 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203070 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9pkf\" (UniqueName: \"kubernetes.io/projected/92c8be36-e854-4f95-b334-6871c09d4e70-kube-api-access-g9pkf\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203120 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data-custom\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203171 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-combined-ca-bundle\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203236 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-combined-ca-bundle\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203266 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47j4p\" (UniqueName: \"kubernetes.io/projected/cf95689d-1932-4302-ac77-331ba7a72b25-kube-api-access-47j4p\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203320 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data-custom\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203343 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.203379 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.207353 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.210873 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-combined-ca-bundle\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.212698 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data-custom\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.232802 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9pkf\" (UniqueName: \"kubernetes.io/projected/92c8be36-e854-4f95-b334-6871c09d4e70-kube-api-access-g9pkf\") pod \"heat-cfnapi-79f65cbcb8-nvmfp\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.306102 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-combined-ca-bundle\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.306151 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47j4p\" (UniqueName: \"kubernetes.io/projected/cf95689d-1932-4302-ac77-331ba7a72b25-kube-api-access-47j4p\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.306207 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data-custom\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.306228 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.313802 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-combined-ca-bundle\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.314171 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data\") pod 
\"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.314846 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data-custom\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.324834 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47j4p\" (UniqueName: \"kubernetes.io/projected/cf95689d-1932-4302-ac77-331ba7a72b25-kube-api-access-47j4p\") pod \"heat-api-754f89c4b4-fntms\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.353861 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.382851 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.787864 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6f7f7b59bc-ft6jn"] Feb 23 08:56:25 crc kubenswrapper[5047]: I0223 08:56:25.988333 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-79f65cbcb8-nvmfp"] Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.066327 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-754f89c4b4-fntms"] Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.696406 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" event={"ID":"b819f642-7df6-4fe0-81b1-531a509215d5","Type":"ContainerStarted","Data":"38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6"} Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.696759 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" event={"ID":"b819f642-7df6-4fe0-81b1-531a509215d5","Type":"ContainerStarted","Data":"482565e6ea56875c298f46fc460066aac8d1171925b123787169ee9ee596b2db"} Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.696782 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.699460 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" event={"ID":"92c8be36-e854-4f95-b334-6871c09d4e70","Type":"ContainerStarted","Data":"e2aec28786edc7c5d820cc3aac7ed9ffa23690428cb36635ce2a2d00aa2494ae"} Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.702913 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-754f89c4b4-fntms" event={"ID":"cf95689d-1932-4302-ac77-331ba7a72b25","Type":"ContainerStarted","Data":"2609dc7a901773787a39c800cf9394bdc569b673fad17e845f58794e3137a30f"} Feb 23 
08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.724157 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" podStartSLOduration=2.72413896 podStartE2EDuration="2.72413896s" podCreationTimestamp="2026-02-23 08:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:56:26.71558249 +0000 UTC m=+7908.966909624" watchObservedRunningTime="2026-02-23 08:56:26.72413896 +0000 UTC m=+7908.975466094" Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.841505 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.955878 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7794c55cb8-dz2pc"] Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.956312 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7794c55cb8-dz2pc" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon-log" containerID="cri-o://70e7711cd78711a953a35e90a7e4aa5585094c69e0b06440ef4d0935f4c9720d" gracePeriod=30 Feb 23 08:56:26 crc kubenswrapper[5047]: I0223 08:56:26.957023 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7794c55cb8-dz2pc" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" containerID="cri-o://a5cf62004fd25e3f3f20a2d2d9f3077f72bdd7c9b10511b881a6633d5f07383a" gracePeriod=30 Feb 23 08:56:29 crc kubenswrapper[5047]: I0223 08:56:29.736929 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-754f89c4b4-fntms" event={"ID":"cf95689d-1932-4302-ac77-331ba7a72b25","Type":"ContainerStarted","Data":"14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41"} Feb 23 08:56:29 crc kubenswrapper[5047]: I0223 08:56:29.737698 5047 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:29 crc kubenswrapper[5047]: I0223 08:56:29.741071 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" event={"ID":"92c8be36-e854-4f95-b334-6871c09d4e70","Type":"ContainerStarted","Data":"a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152"} Feb 23 08:56:29 crc kubenswrapper[5047]: I0223 08:56:29.741258 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:29 crc kubenswrapper[5047]: I0223 08:56:29.755668 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-754f89c4b4-fntms" podStartSLOduration=1.502271424 podStartE2EDuration="4.755638167s" podCreationTimestamp="2026-02-23 08:56:25 +0000 UTC" firstStartedPulling="2026-02-23 08:56:26.071046368 +0000 UTC m=+7908.322373502" lastFinishedPulling="2026-02-23 08:56:29.324413111 +0000 UTC m=+7911.575740245" observedRunningTime="2026-02-23 08:56:29.75278261 +0000 UTC m=+7912.004109744" watchObservedRunningTime="2026-02-23 08:56:29.755638167 +0000 UTC m=+7912.006965301" Feb 23 08:56:29 crc kubenswrapper[5047]: I0223 08:56:29.788330 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" podStartSLOduration=1.465755643 podStartE2EDuration="4.788301673s" podCreationTimestamp="2026-02-23 08:56:25 +0000 UTC" firstStartedPulling="2026-02-23 08:56:26.006647949 +0000 UTC m=+7908.257975083" lastFinishedPulling="2026-02-23 08:56:29.329193979 +0000 UTC m=+7911.580521113" observedRunningTime="2026-02-23 08:56:29.776952838 +0000 UTC m=+7912.028279972" watchObservedRunningTime="2026-02-23 08:56:29.788301673 +0000 UTC m=+7912.039628837" Feb 23 08:56:30 crc kubenswrapper[5047]: I0223 08:56:30.757520 5047 generic.go:334] "Generic (PLEG): container finished" podID="90893ba0-abf6-4d64-be28-115c844f3252" 
containerID="a5cf62004fd25e3f3f20a2d2d9f3077f72bdd7c9b10511b881a6633d5f07383a" exitCode=0 Feb 23 08:56:30 crc kubenswrapper[5047]: I0223 08:56:30.757995 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7794c55cb8-dz2pc" event={"ID":"90893ba0-abf6-4d64-be28-115c844f3252","Type":"ContainerDied","Data":"a5cf62004fd25e3f3f20a2d2d9f3077f72bdd7c9b10511b881a6633d5f07383a"} Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.191564 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6db8c9f77f-vfbhs"] Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.194210 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.211404 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-866f6f8686-qwqpr"] Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.212712 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.234429 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6db8c9f77f-vfbhs"] Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.252211 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-866f6f8686-qwqpr"] Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.263468 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-79cb79dfbb-s8kqv"] Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.265417 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284019 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgm8k\" (UniqueName: \"kubernetes.io/projected/5adc996d-c7bb-49a2-bea0-909b84c93353-kube-api-access-xgm8k\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284072 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjwd5\" (UniqueName: \"kubernetes.io/projected/901e124c-e075-45a1-b768-f7a62a892c36-kube-api-access-cjwd5\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284105 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7h52\" (UniqueName: \"kubernetes.io/projected/355d03ff-6601-4e09-ba7b-2593b0bf0c27-kube-api-access-s7h52\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284126 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-combined-ca-bundle\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284158 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284198 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-combined-ca-bundle\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284222 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data-custom\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284278 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284340 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284368 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data-custom\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284417 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data-custom\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.284436 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-combined-ca-bundle\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.300917 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-79cb79dfbb-s8kqv"] Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.316951 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7794c55cb8-dz2pc" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.113:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8443: connect: connection refused" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.385848 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data-custom\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " 
pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.385974 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386052 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386082 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data-custom\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386133 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data-custom\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386156 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-combined-ca-bundle\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc 
kubenswrapper[5047]: I0223 08:56:32.386196 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgm8k\" (UniqueName: \"kubernetes.io/projected/5adc996d-c7bb-49a2-bea0-909b84c93353-kube-api-access-xgm8k\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386231 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjwd5\" (UniqueName: \"kubernetes.io/projected/901e124c-e075-45a1-b768-f7a62a892c36-kube-api-access-cjwd5\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386270 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h52\" (UniqueName: \"kubernetes.io/projected/355d03ff-6601-4e09-ba7b-2593b0bf0c27-kube-api-access-s7h52\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386297 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-combined-ca-bundle\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.386336 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 
08:56:32.386381 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-combined-ca-bundle\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.393579 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data-custom\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.393935 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.402196 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-combined-ca-bundle\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.407066 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-combined-ca-bundle\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.407344 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.408011 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data-custom\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.408945 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data-custom\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.409956 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjwd5\" (UniqueName: \"kubernetes.io/projected/901e124c-e075-45a1-b768-f7a62a892c36-kube-api-access-cjwd5\") pod \"heat-api-866f6f8686-qwqpr\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.409975 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.410272 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-combined-ca-bundle\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.412126 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7h52\" (UniqueName: \"kubernetes.io/projected/355d03ff-6601-4e09-ba7b-2593b0bf0c27-kube-api-access-s7h52\") pod \"heat-cfnapi-79cb79dfbb-s8kqv\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.423218 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgm8k\" (UniqueName: \"kubernetes.io/projected/5adc996d-c7bb-49a2-bea0-909b84c93353-kube-api-access-xgm8k\") pod \"heat-engine-6db8c9f77f-vfbhs\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.515172 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.534504 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:32 crc kubenswrapper[5047]: I0223 08:56:32.584674 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.212001 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6db8c9f77f-vfbhs"] Feb 23 08:56:33 crc kubenswrapper[5047]: W0223 08:56:33.227219 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod901e124c_e075_45a1_b768_f7a62a892c36.slice/crio-40dd296d3834b0faa7f08512d591dfbd7674619637bdb52ccff0a9289d626024 WatchSource:0}: Error finding container 40dd296d3834b0faa7f08512d591dfbd7674619637bdb52ccff0a9289d626024: Status 404 returned error can't find the container with id 40dd296d3834b0faa7f08512d591dfbd7674619637bdb52ccff0a9289d626024 Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.227527 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-79cb79dfbb-s8kqv"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.257259 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-866f6f8686-qwqpr"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.591266 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-79f65cbcb8-nvmfp"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.591890 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" podUID="92c8be36-e854-4f95-b334-6871c09d4e70" containerName="heat-cfnapi" containerID="cri-o://a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152" gracePeriod=60 Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.607373 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-754f89c4b4-fntms"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.611061 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-754f89c4b4-fntms" 
podUID="cf95689d-1932-4302-ac77-331ba7a72b25" containerName="heat-api" containerID="cri-o://14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41" gracePeriod=60 Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.646343 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f58d8574d-t4gn8"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.651627 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.665394 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58486f655d-lth95"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.666016 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.666269 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.666780 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.680567 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.680692 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.686072 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58486f655d-lth95"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.710110 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f58d8574d-t4gn8"] Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.827452 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6db8c9f77f-vfbhs" event={"ID":"5adc996d-c7bb-49a2-bea0-909b84c93353","Type":"ContainerStarted","Data":"c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746"} Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.827753 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6db8c9f77f-vfbhs" event={"ID":"5adc996d-c7bb-49a2-bea0-909b84c93353","Type":"ContainerStarted","Data":"c7807c5d9f393a58b4ab5708356fd88c32f39c95edb07851bda6944d7a823ac9"} Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.828355 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.829801 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpx64\" (UniqueName: \"kubernetes.io/projected/8596ec60-89ed-43d9-be63-b7130fd0f937-kube-api-access-gpx64\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc 
kubenswrapper[5047]: I0223 08:56:33.831718 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.831949 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-combined-ca-bundle\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.832055 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data-custom\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.832752 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-combined-ca-bundle\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.833112 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data-custom\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " 
pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.833200 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-internal-tls-certs\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.833239 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-internal-tls-certs\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.833430 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.833458 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-public-tls-certs\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.833613 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgmlt\" (UniqueName: \"kubernetes.io/projected/50aef737-c888-466c-92d0-9c683267d266-kube-api-access-jgmlt\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: 
\"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.833682 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-public-tls-certs\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.837217 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-866f6f8686-qwqpr" event={"ID":"901e124c-e075-45a1-b768-f7a62a892c36","Type":"ContainerStarted","Data":"edd2f3fdce90bbe1c681df373dd74416e525f96e9a183d3a9d3062bac2d19195"} Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.837268 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-866f6f8686-qwqpr" event={"ID":"901e124c-e075-45a1-b768-f7a62a892c36","Type":"ContainerStarted","Data":"40dd296d3834b0faa7f08512d591dfbd7674619637bdb52ccff0a9289d626024"} Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.838395 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.851045 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" event={"ID":"355d03ff-6601-4e09-ba7b-2593b0bf0c27","Type":"ContainerStarted","Data":"89fe5d95c3a4a2add3a5340db64d93ad13f530d8a941bd83ce9b856cce027a04"} Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.851145 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" event={"ID":"355d03ff-6601-4e09-ba7b-2593b0bf0c27","Type":"ContainerStarted","Data":"f94a30e610477b525689b2f1390755dfa54d71ab3abd4d3c756e801e7ef291cf"} Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.852303 5047 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.854646 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podStartSLOduration=1.854630319 podStartE2EDuration="1.854630319s" podCreationTimestamp="2026-02-23 08:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:56:33.850250211 +0000 UTC m=+7916.101577355" watchObservedRunningTime="2026-02-23 08:56:33.854630319 +0000 UTC m=+7916.105957453" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.882280 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-866f6f8686-qwqpr" podStartSLOduration=1.882262141 podStartE2EDuration="1.882262141s" podCreationTimestamp="2026-02-23 08:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:56:33.877207395 +0000 UTC m=+7916.128534519" watchObservedRunningTime="2026-02-23 08:56:33.882262141 +0000 UTC m=+7916.133589265" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936587 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-combined-ca-bundle\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936647 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data-custom\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " 
pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936725 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-combined-ca-bundle\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936813 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data-custom\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936854 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-internal-tls-certs\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936876 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-internal-tls-certs\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936895 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 
crc kubenswrapper[5047]: I0223 08:56:33.936934 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-public-tls-certs\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.936979 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgmlt\" (UniqueName: \"kubernetes.io/projected/50aef737-c888-466c-92d0-9c683267d266-kube-api-access-jgmlt\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.937002 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-public-tls-certs\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.937034 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpx64\" (UniqueName: \"kubernetes.io/projected/8596ec60-89ed-43d9-be63-b7130fd0f937-kube-api-access-gpx64\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.937073 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 
08:56:33.939696 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" podStartSLOduration=1.93963175 podStartE2EDuration="1.93963175s" podCreationTimestamp="2026-02-23 08:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:56:33.922755297 +0000 UTC m=+7916.174082431" watchObservedRunningTime="2026-02-23 08:56:33.93963175 +0000 UTC m=+7916.190958884" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.951677 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-combined-ca-bundle\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.951807 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.951697 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data-custom\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.952317 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-internal-tls-certs\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " 
pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.952702 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.953154 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data-custom\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.959757 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-combined-ca-bundle\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.960949 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-public-tls-certs\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.962809 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-internal-tls-certs\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.963461 
5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-public-tls-certs\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.963672 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgmlt\" (UniqueName: \"kubernetes.io/projected/50aef737-c888-466c-92d0-9c683267d266-kube-api-access-jgmlt\") pod \"heat-api-5f58d8574d-t4gn8\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:33 crc kubenswrapper[5047]: I0223 08:56:33.965676 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpx64\" (UniqueName: \"kubernetes.io/projected/8596ec60-89ed-43d9-be63-b7130fd0f937-kube-api-access-gpx64\") pod \"heat-cfnapi-58486f655d-lth95\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") " pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.031607 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.051590 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.378478 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.456769 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data\") pod \"92c8be36-e854-4f95-b334-6871c09d4e70\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.457166 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9pkf\" (UniqueName: \"kubernetes.io/projected/92c8be36-e854-4f95-b334-6871c09d4e70-kube-api-access-g9pkf\") pod \"92c8be36-e854-4f95-b334-6871c09d4e70\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.457268 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data-custom\") pod \"92c8be36-e854-4f95-b334-6871c09d4e70\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.457317 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-combined-ca-bundle\") pod \"92c8be36-e854-4f95-b334-6871c09d4e70\" (UID: \"92c8be36-e854-4f95-b334-6871c09d4e70\") " Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.473768 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92c8be36-e854-4f95-b334-6871c09d4e70" (UID: "92c8be36-e854-4f95-b334-6871c09d4e70"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.486635 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c8be36-e854-4f95-b334-6871c09d4e70-kube-api-access-g9pkf" (OuterVolumeSpecName: "kube-api-access-g9pkf") pod "92c8be36-e854-4f95-b334-6871c09d4e70" (UID: "92c8be36-e854-4f95-b334-6871c09d4e70"). InnerVolumeSpecName "kube-api-access-g9pkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.517157 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92c8be36-e854-4f95-b334-6871c09d4e70" (UID: "92c8be36-e854-4f95-b334-6871c09d4e70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.559667 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9pkf\" (UniqueName: \"kubernetes.io/projected/92c8be36-e854-4f95-b334-6871c09d4e70-kube-api-access-g9pkf\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.559707 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.559717 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.574610 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data" (OuterVolumeSpecName: "config-data") pod "92c8be36-e854-4f95-b334-6871c09d4e70" (UID: "92c8be36-e854-4f95-b334-6871c09d4e70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.661115 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92c8be36-e854-4f95-b334-6871c09d4e70-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.692822 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.762558 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f58d8574d-t4gn8"] Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.866712 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-combined-ca-bundle\") pod \"cf95689d-1932-4302-ac77-331ba7a72b25\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.867332 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47j4p\" (UniqueName: \"kubernetes.io/projected/cf95689d-1932-4302-ac77-331ba7a72b25-kube-api-access-47j4p\") pod \"cf95689d-1932-4302-ac77-331ba7a72b25\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.867366 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data\") pod \"cf95689d-1932-4302-ac77-331ba7a72b25\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " Feb 23 08:56:34 crc 
kubenswrapper[5047]: I0223 08:56:34.867385 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data-custom\") pod \"cf95689d-1932-4302-ac77-331ba7a72b25\" (UID: \"cf95689d-1932-4302-ac77-331ba7a72b25\") " Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.871537 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf95689d-1932-4302-ac77-331ba7a72b25-kube-api-access-47j4p" (OuterVolumeSpecName: "kube-api-access-47j4p") pod "cf95689d-1932-4302-ac77-331ba7a72b25" (UID: "cf95689d-1932-4302-ac77-331ba7a72b25"). InnerVolumeSpecName "kube-api-access-47j4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.872452 5047 generic.go:334] "Generic (PLEG): container finished" podID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerID="89fe5d95c3a4a2add3a5340db64d93ad13f530d8a941bd83ce9b856cce027a04" exitCode=1 Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.872534 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" event={"ID":"355d03ff-6601-4e09-ba7b-2593b0bf0c27","Type":"ContainerDied","Data":"89fe5d95c3a4a2add3a5340db64d93ad13f530d8a941bd83ce9b856cce027a04"} Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.873306 5047 scope.go:117] "RemoveContainer" containerID="89fe5d95c3a4a2add3a5340db64d93ad13f530d8a941bd83ce9b856cce027a04" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.873685 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cf95689d-1932-4302-ac77-331ba7a72b25" (UID: "cf95689d-1932-4302-ac77-331ba7a72b25"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.903140 5047 generic.go:334] "Generic (PLEG): container finished" podID="92c8be36-e854-4f95-b334-6871c09d4e70" containerID="a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152" exitCode=0 Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.903246 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" event={"ID":"92c8be36-e854-4f95-b334-6871c09d4e70","Type":"ContainerDied","Data":"a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152"} Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.903770 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" event={"ID":"92c8be36-e854-4f95-b334-6871c09d4e70","Type":"ContainerDied","Data":"e2aec28786edc7c5d820cc3aac7ed9ffa23690428cb36635ce2a2d00aa2494ae"} Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.903312 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-79f65cbcb8-nvmfp" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.903811 5047 scope.go:117] "RemoveContainer" containerID="a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.904309 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf95689d-1932-4302-ac77-331ba7a72b25" (UID: "cf95689d-1932-4302-ac77-331ba7a72b25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.922954 5047 generic.go:334] "Generic (PLEG): container finished" podID="cf95689d-1932-4302-ac77-331ba7a72b25" containerID="14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41" exitCode=0 Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.923068 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-754f89c4b4-fntms" event={"ID":"cf95689d-1932-4302-ac77-331ba7a72b25","Type":"ContainerDied","Data":"14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41"} Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.923114 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-754f89c4b4-fntms" event={"ID":"cf95689d-1932-4302-ac77-331ba7a72b25","Type":"ContainerDied","Data":"2609dc7a901773787a39c800cf9394bdc569b673fad17e845f58794e3137a30f"} Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.923189 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-754f89c4b4-fntms" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.927854 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f58d8574d-t4gn8" event={"ID":"50aef737-c888-466c-92d0-9c683267d266","Type":"ContainerStarted","Data":"397109ef6a97ec926b5e9f308fb40a8bd07bff7b12782befbb76658a1a9c9250"} Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.929643 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58486f655d-lth95"] Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.937168 5047 generic.go:334] "Generic (PLEG): container finished" podID="901e124c-e075-45a1-b768-f7a62a892c36" containerID="edd2f3fdce90bbe1c681df373dd74416e525f96e9a183d3a9d3062bac2d19195" exitCode=1 Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.937456 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-866f6f8686-qwqpr" event={"ID":"901e124c-e075-45a1-b768-f7a62a892c36","Type":"ContainerDied","Data":"edd2f3fdce90bbe1c681df373dd74416e525f96e9a183d3a9d3062bac2d19195"} Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.938677 5047 scope.go:117] "RemoveContainer" containerID="edd2f3fdce90bbe1c681df373dd74416e525f96e9a183d3a9d3062bac2d19195" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.964152 5047 scope.go:117] "RemoveContainer" containerID="a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152" Feb 23 08:56:34 crc kubenswrapper[5047]: E0223 08:56:34.965718 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152\": container with ID starting with a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152 not found: ID does not exist" containerID="a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.965787 5047 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152"} err="failed to get container status \"a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152\": rpc error: code = NotFound desc = could not find container \"a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152\": container with ID starting with a23e441c2fd57c6b1ebf7e21f571c8fb6537733e1afe33717923e3eef6a88152 not found: ID does not exist" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.965832 5047 scope.go:117] "RemoveContainer" containerID="14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.971937 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.976861 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47j4p\" (UniqueName: \"kubernetes.io/projected/cf95689d-1932-4302-ac77-331ba7a72b25-kube-api-access-47j4p\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.976877 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.979336 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data" (OuterVolumeSpecName: "config-data") pod "cf95689d-1932-4302-ac77-331ba7a72b25" (UID: "cf95689d-1932-4302-ac77-331ba7a72b25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:34 crc kubenswrapper[5047]: I0223 08:56:34.998131 5047 scope.go:117] "RemoveContainer" containerID="14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41" Feb 23 08:56:35 crc kubenswrapper[5047]: E0223 08:56:34.999933 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41\": container with ID starting with 14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41 not found: ID does not exist" containerID="14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41" Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:34.999979 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41"} err="failed to get container status \"14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41\": rpc error: code = NotFound desc = could not find container \"14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41\": container with ID starting with 14feb2c529a658106eb2624dafa44f45891ee7a5b71324690b2497d70f22fb41 not found: ID does not exist" Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.002247 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-79f65cbcb8-nvmfp"] Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.015866 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-79f65cbcb8-nvmfp"] Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.080367 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf95689d-1932-4302-ac77-331ba7a72b25-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.258810 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-api-754f89c4b4-fntms"] Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.268722 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-754f89c4b4-fntms"] Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.976139 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58486f655d-lth95" event={"ID":"8596ec60-89ed-43d9-be63-b7130fd0f937","Type":"ContainerStarted","Data":"38bbd56ee9a5e5626e98c5f9e39c04af222fb0e3d5740284ea5093fdcc638927"} Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.976717 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58486f655d-lth95" event={"ID":"8596ec60-89ed-43d9-be63-b7130fd0f937","Type":"ContainerStarted","Data":"08643c152f10fec24bf41bde57987b1345ca076fd1ae2d95553fdcbd849612e5"} Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.976743 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.981024 5047 generic.go:334] "Generic (PLEG): container finished" podID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerID="231e2bea88046e38503125504e69099707472a13682a8873819fe4fff85d90eb" exitCode=1 Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.981123 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" event={"ID":"355d03ff-6601-4e09-ba7b-2593b0bf0c27","Type":"ContainerDied","Data":"231e2bea88046e38503125504e69099707472a13682a8873819fe4fff85d90eb"} Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.981210 5047 scope.go:117] "RemoveContainer" containerID="89fe5d95c3a4a2add3a5340db64d93ad13f530d8a941bd83ce9b856cce027a04" Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.981823 5047 scope.go:117] "RemoveContainer" containerID="231e2bea88046e38503125504e69099707472a13682a8873819fe4fff85d90eb" Feb 23 08:56:35 crc kubenswrapper[5047]: E0223 
08:56:35.982104 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-79cb79dfbb-s8kqv_openstack(355d03ff-6601-4e09-ba7b-2593b0bf0c27)\"" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.988725 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f58d8574d-t4gn8" event={"ID":"50aef737-c888-466c-92d0-9c683267d266","Type":"ContainerStarted","Data":"d228a3b0cadcbb3c297b956ba0f819355627a6ecf7412f2e7ba427ce670fe91f"} Feb 23 08:56:35 crc kubenswrapper[5047]: I0223 08:56:35.988781 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.006284 5047 generic.go:334] "Generic (PLEG): container finished" podID="901e124c-e075-45a1-b768-f7a62a892c36" containerID="1d6a11214bdb9acc0a409d6cc41f32700274bf27e224ed3b88c518695f57704d" exitCode=1 Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.006354 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-866f6f8686-qwqpr" event={"ID":"901e124c-e075-45a1-b768-f7a62a892c36","Type":"ContainerDied","Data":"1d6a11214bdb9acc0a409d6cc41f32700274bf27e224ed3b88c518695f57704d"} Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.007114 5047 scope.go:117] "RemoveContainer" containerID="1d6a11214bdb9acc0a409d6cc41f32700274bf27e224ed3b88c518695f57704d" Feb 23 08:56:36 crc kubenswrapper[5047]: E0223 08:56:36.007454 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-866f6f8686-qwqpr_openstack(901e124c-e075-45a1-b768-f7a62a892c36)\"" pod="openstack/heat-api-866f6f8686-qwqpr" 
podUID="901e124c-e075-45a1-b768-f7a62a892c36" Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.008970 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-58486f655d-lth95" podStartSLOduration=3.008944218 podStartE2EDuration="3.008944218s" podCreationTimestamp="2026-02-23 08:56:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:56:36.000595634 +0000 UTC m=+7918.251922788" watchObservedRunningTime="2026-02-23 08:56:36.008944218 +0000 UTC m=+7918.260271372" Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.088482 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5f58d8574d-t4gn8" podStartSLOduration=3.088460163 podStartE2EDuration="3.088460163s" podCreationTimestamp="2026-02-23 08:56:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:56:36.064402917 +0000 UTC m=+7918.315730051" watchObservedRunningTime="2026-02-23 08:56:36.088460163 +0000 UTC m=+7918.339787297" Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.131171 5047 scope.go:117] "RemoveContainer" containerID="edd2f3fdce90bbe1c681df373dd74416e525f96e9a183d3a9d3062bac2d19195" Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.350507 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92c8be36-e854-4f95-b334-6871c09d4e70" path="/var/lib/kubelet/pods/92c8be36-e854-4f95-b334-6871c09d4e70/volumes" Feb 23 08:56:36 crc kubenswrapper[5047]: I0223 08:56:36.351361 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf95689d-1932-4302-ac77-331ba7a72b25" path="/var/lib/kubelet/pods/cf95689d-1932-4302-ac77-331ba7a72b25/volumes" Feb 23 08:56:37 crc kubenswrapper[5047]: I0223 08:56:37.017603 5047 scope.go:117] "RemoveContainer" 
containerID="1d6a11214bdb9acc0a409d6cc41f32700274bf27e224ed3b88c518695f57704d" Feb 23 08:56:37 crc kubenswrapper[5047]: E0223 08:56:37.018272 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-866f6f8686-qwqpr_openstack(901e124c-e075-45a1-b768-f7a62a892c36)\"" pod="openstack/heat-api-866f6f8686-qwqpr" podUID="901e124c-e075-45a1-b768-f7a62a892c36" Feb 23 08:56:37 crc kubenswrapper[5047]: I0223 08:56:37.019082 5047 scope.go:117] "RemoveContainer" containerID="231e2bea88046e38503125504e69099707472a13682a8873819fe4fff85d90eb" Feb 23 08:56:37 crc kubenswrapper[5047]: E0223 08:56:37.019332 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-79cb79dfbb-s8kqv_openstack(355d03ff-6601-4e09-ba7b-2593b0bf0c27)\"" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" Feb 23 08:56:37 crc kubenswrapper[5047]: I0223 08:56:37.535710 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:37 crc kubenswrapper[5047]: I0223 08:56:37.535807 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:37 crc kubenswrapper[5047]: I0223 08:56:37.584835 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:37 crc kubenswrapper[5047]: I0223 08:56:37.585005 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:38 crc kubenswrapper[5047]: I0223 08:56:38.028123 5047 scope.go:117] "RemoveContainer" containerID="1d6a11214bdb9acc0a409d6cc41f32700274bf27e224ed3b88c518695f57704d" Feb 
23 08:56:38 crc kubenswrapper[5047]: I0223 08:56:38.028404 5047 scope.go:117] "RemoveContainer" containerID="231e2bea88046e38503125504e69099707472a13682a8873819fe4fff85d90eb" Feb 23 08:56:38 crc kubenswrapper[5047]: E0223 08:56:38.028420 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-866f6f8686-qwqpr_openstack(901e124c-e075-45a1-b768-f7a62a892c36)\"" pod="openstack/heat-api-866f6f8686-qwqpr" podUID="901e124c-e075-45a1-b768-f7a62a892c36" Feb 23 08:56:38 crc kubenswrapper[5047]: E0223 08:56:38.028689 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-79cb79dfbb-s8kqv_openstack(355d03ff-6601-4e09-ba7b-2593b0bf0c27)\"" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" Feb 23 08:56:39 crc kubenswrapper[5047]: I0223 08:56:39.040307 5047 scope.go:117] "RemoveContainer" containerID="231e2bea88046e38503125504e69099707472a13682a8873819fe4fff85d90eb" Feb 23 08:56:39 crc kubenswrapper[5047]: E0223 08:56:39.041260 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-79cb79dfbb-s8kqv_openstack(355d03ff-6601-4e09-ba7b-2593b0bf0c27)\"" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" Feb 23 08:56:42 crc kubenswrapper[5047]: I0223 08:56:42.316375 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7794c55cb8-dz2pc" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.113:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8443: connect: 
connection refused" Feb 23 08:56:45 crc kubenswrapper[5047]: I0223 08:56:45.291586 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:56:45 crc kubenswrapper[5047]: I0223 08:56:45.395407 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 08:56:45 crc kubenswrapper[5047]: I0223 08:56:45.427415 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-58486f655d-lth95" Feb 23 08:56:45 crc kubenswrapper[5047]: I0223 08:56:45.455364 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-866f6f8686-qwqpr"] Feb 23 08:56:45 crc kubenswrapper[5047]: I0223 08:56:45.514088 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-79cb79dfbb-s8kqv"] Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:45.993981 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.006461 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.099667 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjwd5\" (UniqueName: \"kubernetes.io/projected/901e124c-e075-45a1-b768-f7a62a892c36-kube-api-access-cjwd5\") pod \"901e124c-e075-45a1-b768-f7a62a892c36\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.099698 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data-custom\") pod \"901e124c-e075-45a1-b768-f7a62a892c36\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.099724 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data\") pod \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.099746 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data\") pod \"901e124c-e075-45a1-b768-f7a62a892c36\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.099799 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-combined-ca-bundle\") pod \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.099982 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-s7h52\" (UniqueName: \"kubernetes.io/projected/355d03ff-6601-4e09-ba7b-2593b0bf0c27-kube-api-access-s7h52\") pod \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.100660 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-combined-ca-bundle\") pod \"901e124c-e075-45a1-b768-f7a62a892c36\" (UID: \"901e124c-e075-45a1-b768-f7a62a892c36\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.100698 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data-custom\") pod \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\" (UID: \"355d03ff-6601-4e09-ba7b-2593b0bf0c27\") " Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.107210 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901e124c-e075-45a1-b768-f7a62a892c36-kube-api-access-cjwd5" (OuterVolumeSpecName: "kube-api-access-cjwd5") pod "901e124c-e075-45a1-b768-f7a62a892c36" (UID: "901e124c-e075-45a1-b768-f7a62a892c36"). InnerVolumeSpecName "kube-api-access-cjwd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.107210 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "355d03ff-6601-4e09-ba7b-2593b0bf0c27" (UID: "355d03ff-6601-4e09-ba7b-2593b0bf0c27"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.107255 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355d03ff-6601-4e09-ba7b-2593b0bf0c27-kube-api-access-s7h52" (OuterVolumeSpecName: "kube-api-access-s7h52") pod "355d03ff-6601-4e09-ba7b-2593b0bf0c27" (UID: "355d03ff-6601-4e09-ba7b-2593b0bf0c27"). InnerVolumeSpecName "kube-api-access-s7h52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.107238 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "901e124c-e075-45a1-b768-f7a62a892c36" (UID: "901e124c-e075-45a1-b768-f7a62a892c36"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.115922 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-866f6f8686-qwqpr" event={"ID":"901e124c-e075-45a1-b768-f7a62a892c36","Type":"ContainerDied","Data":"40dd296d3834b0faa7f08512d591dfbd7674619637bdb52ccff0a9289d626024"} Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.115970 5047 scope.go:117] "RemoveContainer" containerID="1d6a11214bdb9acc0a409d6cc41f32700274bf27e224ed3b88c518695f57704d" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.116079 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-866f6f8686-qwqpr" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.118483 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" event={"ID":"355d03ff-6601-4e09-ba7b-2593b0bf0c27","Type":"ContainerDied","Data":"f94a30e610477b525689b2f1390755dfa54d71ab3abd4d3c756e801e7ef291cf"} Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.118553 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-79cb79dfbb-s8kqv" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.130611 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "901e124c-e075-45a1-b768-f7a62a892c36" (UID: "901e124c-e075-45a1-b768-f7a62a892c36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.137635 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "355d03ff-6601-4e09-ba7b-2593b0bf0c27" (UID: "355d03ff-6601-4e09-ba7b-2593b0bf0c27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.168842 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data" (OuterVolumeSpecName: "config-data") pod "901e124c-e075-45a1-b768-f7a62a892c36" (UID: "901e124c-e075-45a1-b768-f7a62a892c36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.179352 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data" (OuterVolumeSpecName: "config-data") pod "355d03ff-6601-4e09-ba7b-2593b0bf0c27" (UID: "355d03ff-6601-4e09-ba7b-2593b0bf0c27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.202932 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.202959 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.202971 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjwd5\" (UniqueName: \"kubernetes.io/projected/901e124c-e075-45a1-b768-f7a62a892c36-kube-api-access-cjwd5\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.202982 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.202991 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.203001 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/901e124c-e075-45a1-b768-f7a62a892c36-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.203009 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355d03ff-6601-4e09-ba7b-2593b0bf0c27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.203017 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7h52\" (UniqueName: \"kubernetes.io/projected/355d03ff-6601-4e09-ba7b-2593b0bf0c27-kube-api-access-s7h52\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.237323 5047 scope.go:117] "RemoveContainer" containerID="231e2bea88046e38503125504e69099707472a13682a8873819fe4fff85d90eb" Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.453846 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-866f6f8686-qwqpr"] Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.460867 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-866f6f8686-qwqpr"] Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.493220 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-79cb79dfbb-s8kqv"] Feb 23 08:56:46 crc kubenswrapper[5047]: I0223 08:56:46.503837 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-79cb79dfbb-s8kqv"] Feb 23 08:56:48 crc kubenswrapper[5047]: I0223 08:56:48.354776 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" path="/var/lib/kubelet/pods/355d03ff-6601-4e09-ba7b-2593b0bf0c27/volumes" Feb 23 08:56:48 crc kubenswrapper[5047]: I0223 08:56:48.356306 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901e124c-e075-45a1-b768-f7a62a892c36" path="/var/lib/kubelet/pods/901e124c-e075-45a1-b768-f7a62a892c36/volumes" Feb 23 
08:56:52 crc kubenswrapper[5047]: I0223 08:56:52.317652 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7794c55cb8-dz2pc" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.113:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8443: connect: connection refused" Feb 23 08:56:52 crc kubenswrapper[5047]: I0223 08:56:52.318533 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:56:52 crc kubenswrapper[5047]: I0223 08:56:52.568816 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 08:56:52 crc kubenswrapper[5047]: I0223 08:56:52.638803 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6f7f7b59bc-ft6jn"] Feb 23 08:56:52 crc kubenswrapper[5047]: I0223 08:56:52.639069 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" podUID="b819f642-7df6-4fe0-81b1-531a509215d5" containerName="heat-engine" containerID="cri-o://38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" gracePeriod=60 Feb 23 08:56:55 crc kubenswrapper[5047]: E0223 08:56:55.235254 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 08:56:55 crc kubenswrapper[5047]: E0223 08:56:55.243351 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" 
cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 08:56:55 crc kubenswrapper[5047]: E0223 08:56:55.245488 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 08:56:55 crc kubenswrapper[5047]: E0223 08:56:55.245519 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" podUID="b819f642-7df6-4fe0-81b1-531a509215d5" containerName="heat-engine" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.258797 5047 generic.go:334] "Generic (PLEG): container finished" podID="90893ba0-abf6-4d64-be28-115c844f3252" containerID="70e7711cd78711a953a35e90a7e4aa5585094c69e0b06440ef4d0935f4c9720d" exitCode=137 Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.259677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7794c55cb8-dz2pc" event={"ID":"90893ba0-abf6-4d64-be28-115c844f3252","Type":"ContainerDied","Data":"70e7711cd78711a953a35e90a7e4aa5585094c69e0b06440ef4d0935f4c9720d"} Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.471944 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.564433 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx296\" (UniqueName: \"kubernetes.io/projected/90893ba0-abf6-4d64-be28-115c844f3252-kube-api-access-vx296\") pod \"90893ba0-abf6-4d64-be28-115c844f3252\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.564501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-scripts\") pod \"90893ba0-abf6-4d64-be28-115c844f3252\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.564549 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-config-data\") pod \"90893ba0-abf6-4d64-be28-115c844f3252\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.564573 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90893ba0-abf6-4d64-be28-115c844f3252-logs\") pod \"90893ba0-abf6-4d64-be28-115c844f3252\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.564844 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-secret-key\") pod \"90893ba0-abf6-4d64-be28-115c844f3252\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.564934 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-tls-certs\") pod \"90893ba0-abf6-4d64-be28-115c844f3252\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.565021 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-combined-ca-bundle\") pod \"90893ba0-abf6-4d64-be28-115c844f3252\" (UID: \"90893ba0-abf6-4d64-be28-115c844f3252\") " Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.566936 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90893ba0-abf6-4d64-be28-115c844f3252-logs" (OuterVolumeSpecName: "logs") pod "90893ba0-abf6-4d64-be28-115c844f3252" (UID: "90893ba0-abf6-4d64-be28-115c844f3252"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.578696 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90893ba0-abf6-4d64-be28-115c844f3252-kube-api-access-vx296" (OuterVolumeSpecName: "kube-api-access-vx296") pod "90893ba0-abf6-4d64-be28-115c844f3252" (UID: "90893ba0-abf6-4d64-be28-115c844f3252"). InnerVolumeSpecName "kube-api-access-vx296". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.592786 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-config-data" (OuterVolumeSpecName: "config-data") pod "90893ba0-abf6-4d64-be28-115c844f3252" (UID: "90893ba0-abf6-4d64-be28-115c844f3252"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.595042 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "90893ba0-abf6-4d64-be28-115c844f3252" (UID: "90893ba0-abf6-4d64-be28-115c844f3252"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.597712 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90893ba0-abf6-4d64-be28-115c844f3252" (UID: "90893ba0-abf6-4d64-be28-115c844f3252"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.611315 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-scripts" (OuterVolumeSpecName: "scripts") pod "90893ba0-abf6-4d64-be28-115c844f3252" (UID: "90893ba0-abf6-4d64-be28-115c844f3252"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.622069 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "90893ba0-abf6-4d64-be28-115c844f3252" (UID: "90893ba0-abf6-4d64-be28-115c844f3252"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.668100 5047 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.668135 5047 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.668282 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90893ba0-abf6-4d64-be28-115c844f3252-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.668294 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx296\" (UniqueName: \"kubernetes.io/projected/90893ba0-abf6-4d64-be28-115c844f3252-kube-api-access-vx296\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.668304 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.668313 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90893ba0-abf6-4d64-be28-115c844f3252-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:57 crc kubenswrapper[5047]: I0223 08:56:57.668376 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90893ba0-abf6-4d64-be28-115c844f3252-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:58 crc kubenswrapper[5047]: I0223 08:56:58.275408 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7794c55cb8-dz2pc" event={"ID":"90893ba0-abf6-4d64-be28-115c844f3252","Type":"ContainerDied","Data":"e457ce8eeba6435b150c9187e7ab69b876f22b5794652c51d5b371cec0fc0ba5"} Feb 23 08:56:58 crc kubenswrapper[5047]: I0223 08:56:58.275785 5047 scope.go:117] "RemoveContainer" containerID="a5cf62004fd25e3f3f20a2d2d9f3077f72bdd7c9b10511b881a6633d5f07383a" Feb 23 08:56:58 crc kubenswrapper[5047]: I0223 08:56:58.275956 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7794c55cb8-dz2pc" Feb 23 08:56:58 crc kubenswrapper[5047]: I0223 08:56:58.326142 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7794c55cb8-dz2pc"] Feb 23 08:56:58 crc kubenswrapper[5047]: I0223 08:56:58.333514 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7794c55cb8-dz2pc"] Feb 23 08:56:58 crc kubenswrapper[5047]: I0223 08:56:58.359577 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90893ba0-abf6-4d64-be28-115c844f3252" path="/var/lib/kubelet/pods/90893ba0-abf6-4d64-be28-115c844f3252/volumes" Feb 23 08:56:58 crc kubenswrapper[5047]: I0223 08:56:58.542967 5047 scope.go:117] "RemoveContainer" containerID="70e7711cd78711a953a35e90a7e4aa5585094c69e0b06440ef4d0935f4c9720d" Feb 23 08:57:00 crc kubenswrapper[5047]: I0223 08:57:00.051115 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lr25h"] Feb 23 08:57:00 crc kubenswrapper[5047]: I0223 08:57:00.067454 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6c0a-account-create-update-jbdtd"] Feb 23 08:57:00 crc kubenswrapper[5047]: I0223 08:57:00.082597 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lr25h"] Feb 23 08:57:00 crc kubenswrapper[5047]: I0223 08:57:00.096583 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-6c0a-account-create-update-jbdtd"] Feb 23 08:57:00 crc kubenswrapper[5047]: I0223 08:57:00.388065 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3824710f-3773-4281-8c49-b9799daa3671" path="/var/lib/kubelet/pods/3824710f-3773-4281-8c49-b9799daa3671/volumes" Feb 23 08:57:00 crc kubenswrapper[5047]: I0223 08:57:00.391220 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d" path="/var/lib/kubelet/pods/dd0727e5-c2c0-41dc-ab5b-42fefccaaa6d/volumes" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.019890 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.144271 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data\") pod \"b819f642-7df6-4fe0-81b1-531a509215d5\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.144496 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-combined-ca-bundle\") pod \"b819f642-7df6-4fe0-81b1-531a509215d5\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.144594 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data-custom\") pod \"b819f642-7df6-4fe0-81b1-531a509215d5\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.144658 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4hj6\" (UniqueName: 
\"kubernetes.io/projected/b819f642-7df6-4fe0-81b1-531a509215d5-kube-api-access-f4hj6\") pod \"b819f642-7df6-4fe0-81b1-531a509215d5\" (UID: \"b819f642-7df6-4fe0-81b1-531a509215d5\") " Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.151772 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b819f642-7df6-4fe0-81b1-531a509215d5" (UID: "b819f642-7df6-4fe0-81b1-531a509215d5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.152421 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b819f642-7df6-4fe0-81b1-531a509215d5-kube-api-access-f4hj6" (OuterVolumeSpecName: "kube-api-access-f4hj6") pod "b819f642-7df6-4fe0-81b1-531a509215d5" (UID: "b819f642-7df6-4fe0-81b1-531a509215d5"). InnerVolumeSpecName "kube-api-access-f4hj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.188767 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b819f642-7df6-4fe0-81b1-531a509215d5" (UID: "b819f642-7df6-4fe0-81b1-531a509215d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.225565 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data" (OuterVolumeSpecName: "config-data") pod "b819f642-7df6-4fe0-81b1-531a509215d5" (UID: "b819f642-7df6-4fe0-81b1-531a509215d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.247787 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.247841 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.247857 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b819f642-7df6-4fe0-81b1-531a509215d5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.247873 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4hj6\" (UniqueName: \"kubernetes.io/projected/b819f642-7df6-4fe0-81b1-531a509215d5-kube-api-access-f4hj6\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.361144 5047 generic.go:334] "Generic (PLEG): container finished" podID="b819f642-7df6-4fe0-81b1-531a509215d5" containerID="38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" exitCode=0 Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.361257 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" event={"ID":"b819f642-7df6-4fe0-81b1-531a509215d5","Type":"ContainerDied","Data":"38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6"} Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.361281 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.361304 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f7f7b59bc-ft6jn" event={"ID":"b819f642-7df6-4fe0-81b1-531a509215d5","Type":"ContainerDied","Data":"482565e6ea56875c298f46fc460066aac8d1171925b123787169ee9ee596b2db"} Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.361357 5047 scope.go:117] "RemoveContainer" containerID="38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.398685 5047 scope.go:117] "RemoveContainer" containerID="38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" Feb 23 08:57:05 crc kubenswrapper[5047]: E0223 08:57:05.399405 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6\": container with ID starting with 38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6 not found: ID does not exist" containerID="38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.399448 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6"} err="failed to get container status \"38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6\": rpc error: code = NotFound desc = could not find container \"38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6\": container with ID starting with 38c11578888d8b5cbd6b910ee0668dfad44f46e163f66e21f1d2e3aecda3c8a6 not found: ID does not exist" Feb 23 08:57:05 crc kubenswrapper[5047]: I0223 08:57:05.461107 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6f7f7b59bc-ft6jn"] Feb 23 08:57:05 crc 
kubenswrapper[5047]: I0223 08:57:05.511913 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6f7f7b59bc-ft6jn"] Feb 23 08:57:06 crc kubenswrapper[5047]: I0223 08:57:06.356217 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b819f642-7df6-4fe0-81b1-531a509215d5" path="/var/lib/kubelet/pods/b819f642-7df6-4fe0-81b1-531a509215d5/volumes" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.251693 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd"] Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.252866 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901e124c-e075-45a1-b768-f7a62a892c36" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.252887 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="901e124c-e075-45a1-b768-f7a62a892c36" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.252923 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.252934 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.252949 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon-log" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.252956 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon-log" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.252974 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerName="heat-cfnapi" Feb 23 08:57:14 crc 
kubenswrapper[5047]: I0223 08:57:14.252981 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.252989 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf95689d-1932-4302-ac77-331ba7a72b25" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.252994 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf95689d-1932-4302-ac77-331ba7a72b25" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.253007 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b819f642-7df6-4fe0-81b1-531a509215d5" containerName="heat-engine" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253013 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b819f642-7df6-4fe0-81b1-531a509215d5" containerName="heat-engine" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.253028 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253034 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.253050 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c8be36-e854-4f95-b334-6871c09d4e70" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253056 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c8be36-e854-4f95-b334-6871c09d4e70" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253222 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="901e124c-e075-45a1-b768-f7a62a892c36" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253231 5047 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253240 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf95689d-1932-4302-ac77-331ba7a72b25" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253252 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b819f642-7df6-4fe0-81b1-531a509215d5" containerName="heat-engine" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253261 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="901e124c-e075-45a1-b768-f7a62a892c36" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253272 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="92c8be36-e854-4f95-b334-6871c09d4e70" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253282 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="90893ba0-abf6-4d64-be28-115c844f3252" containerName="horizon-log" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253292 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253323 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="355d03ff-6601-4e09-ba7b-2593b0bf0c27" containerName="heat-cfnapi" Feb 23 08:57:14 crc kubenswrapper[5047]: E0223 08:57:14.253488 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901e124c-e075-45a1-b768-f7a62a892c36" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.253496 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="901e124c-e075-45a1-b768-f7a62a892c36" containerName="heat-api" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.254619 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.258044 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.273024 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd"] Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.354715 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.354776 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67xkd\" (UniqueName: \"kubernetes.io/projected/9d5da52b-350d-4489-ba83-8e8db44549b3-kube-api-access-67xkd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.355065 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: 
I0223 08:57:14.463988 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.464207 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.464253 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67xkd\" (UniqueName: \"kubernetes.io/projected/9d5da52b-350d-4489-ba83-8e8db44549b3-kube-api-access-67xkd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.469512 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.469802 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.514215 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67xkd\" (UniqueName: \"kubernetes.io/projected/9d5da52b-350d-4489-ba83-8e8db44549b3-kube-api-access-67xkd\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:14 crc kubenswrapper[5047]: I0223 08:57:14.581710 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:15 crc kubenswrapper[5047]: I0223 08:57:15.097173 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd"] Feb 23 08:57:15 crc kubenswrapper[5047]: I0223 08:57:15.543011 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" event={"ID":"9d5da52b-350d-4489-ba83-8e8db44549b3","Type":"ContainerStarted","Data":"ea4a7b95434b906aa1086870c1e53f25fa741caea7213e71d7c1af092c53cd80"} Feb 23 08:57:15 crc kubenswrapper[5047]: I0223 08:57:15.543605 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" event={"ID":"9d5da52b-350d-4489-ba83-8e8db44549b3","Type":"ContainerStarted","Data":"5ba150f30305dd70bbba53d9aad2d6c578c5a2ca5f1f06216993c2e8b110d016"} Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.556226 5047 
generic.go:334] "Generic (PLEG): container finished" podID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerID="ea4a7b95434b906aa1086870c1e53f25fa741caea7213e71d7c1af092c53cd80" exitCode=0 Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.556344 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" event={"ID":"9d5da52b-350d-4489-ba83-8e8db44549b3","Type":"ContainerDied","Data":"ea4a7b95434b906aa1086870c1e53f25fa741caea7213e71d7c1af092c53cd80"} Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.593148 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kp6dp"] Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.596774 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.627343 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kp6dp"] Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.636277 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-utilities\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.636423 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrlh2\" (UniqueName: \"kubernetes.io/projected/7b712e80-d659-48f1-a17a-32bd8c4d2e25-kube-api-access-zrlh2\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.636508 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-catalog-content\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.738250 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrlh2\" (UniqueName: \"kubernetes.io/projected/7b712e80-d659-48f1-a17a-32bd8c4d2e25-kube-api-access-zrlh2\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.738327 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-catalog-content\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.738495 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-utilities\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.739016 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-utilities\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.739615 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-catalog-content\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.759510 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.759571 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.772888 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrlh2\" (UniqueName: \"kubernetes.io/projected/7b712e80-d659-48f1-a17a-32bd8c4d2e25-kube-api-access-zrlh2\") pod \"redhat-operators-kp6dp\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:16 crc kubenswrapper[5047]: I0223 08:57:16.944052 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:17 crc kubenswrapper[5047]: I0223 08:57:17.491454 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kp6dp"] Feb 23 08:57:17 crc kubenswrapper[5047]: I0223 08:57:17.568387 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp6dp" event={"ID":"7b712e80-d659-48f1-a17a-32bd8c4d2e25","Type":"ContainerStarted","Data":"0139cff1d14dd6b65deaf28ba6e1ff5aa1b48426f0f91b77688410e2eb01d72c"} Feb 23 08:57:18 crc kubenswrapper[5047]: I0223 08:57:18.579621 5047 generic.go:334] "Generic (PLEG): container finished" podID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerID="c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774" exitCode=0 Feb 23 08:57:18 crc kubenswrapper[5047]: I0223 08:57:18.580350 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp6dp" event={"ID":"7b712e80-d659-48f1-a17a-32bd8c4d2e25","Type":"ContainerDied","Data":"c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774"} Feb 23 08:57:18 crc kubenswrapper[5047]: I0223 08:57:18.585803 5047 generic.go:334] "Generic (PLEG): container finished" podID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerID="aad91feffe2a2b6ecfbe73331f698209b0c732882a355486e6cdd1e92c236c86" exitCode=0 Feb 23 08:57:18 crc kubenswrapper[5047]: I0223 08:57:18.585860 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" event={"ID":"9d5da52b-350d-4489-ba83-8e8db44549b3","Type":"ContainerDied","Data":"aad91feffe2a2b6ecfbe73331f698209b0c732882a355486e6cdd1e92c236c86"} Feb 23 08:57:19 crc kubenswrapper[5047]: I0223 08:57:19.601790 5047 generic.go:334] "Generic (PLEG): container finished" podID="9d5da52b-350d-4489-ba83-8e8db44549b3" 
containerID="0ea6696ed9633b5429d9310a76795762cbb36cb8e21e20bdeaacdad4bc81630a" exitCode=0 Feb 23 08:57:19 crc kubenswrapper[5047]: I0223 08:57:19.601975 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" event={"ID":"9d5da52b-350d-4489-ba83-8e8db44549b3","Type":"ContainerDied","Data":"0ea6696ed9633b5429d9310a76795762cbb36cb8e21e20bdeaacdad4bc81630a"} Feb 23 08:57:19 crc kubenswrapper[5047]: I0223 08:57:19.610120 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp6dp" event={"ID":"7b712e80-d659-48f1-a17a-32bd8c4d2e25","Type":"ContainerStarted","Data":"c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70"} Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.068366 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.256620 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-bundle\") pod \"9d5da52b-350d-4489-ba83-8e8db44549b3\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.256723 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67xkd\" (UniqueName: \"kubernetes.io/projected/9d5da52b-350d-4489-ba83-8e8db44549b3-kube-api-access-67xkd\") pod \"9d5da52b-350d-4489-ba83-8e8db44549b3\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.256761 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util\") pod \"9d5da52b-350d-4489-ba83-8e8db44549b3\" 
(UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.259035 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-bundle" (OuterVolumeSpecName: "bundle") pod "9d5da52b-350d-4489-ba83-8e8db44549b3" (UID: "9d5da52b-350d-4489-ba83-8e8db44549b3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.263171 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5da52b-350d-4489-ba83-8e8db44549b3-kube-api-access-67xkd" (OuterVolumeSpecName: "kube-api-access-67xkd") pod "9d5da52b-350d-4489-ba83-8e8db44549b3" (UID: "9d5da52b-350d-4489-ba83-8e8db44549b3"). InnerVolumeSpecName "kube-api-access-67xkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.360952 5047 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.361014 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67xkd\" (UniqueName: \"kubernetes.io/projected/9d5da52b-350d-4489-ba83-8e8db44549b3-kube-api-access-67xkd\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.642472 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" event={"ID":"9d5da52b-350d-4489-ba83-8e8db44549b3","Type":"ContainerDied","Data":"5ba150f30305dd70bbba53d9aad2d6c578c5a2ca5f1f06216993c2e8b110d016"} Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.642517 5047 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5ba150f30305dd70bbba53d9aad2d6c578c5a2ca5f1f06216993c2e8b110d016" Feb 23 08:57:21 crc kubenswrapper[5047]: I0223 08:57:21.642594 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd" Feb 23 08:57:22 crc kubenswrapper[5047]: I0223 08:57:22.384985 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util" (OuterVolumeSpecName: "util") pod "9d5da52b-350d-4489-ba83-8e8db44549b3" (UID: "9d5da52b-350d-4489-ba83-8e8db44549b3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5047]: I0223 08:57:22.386501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util\") pod \"9d5da52b-350d-4489-ba83-8e8db44549b3\" (UID: \"9d5da52b-350d-4489-ba83-8e8db44549b3\") " Feb 23 08:57:22 crc kubenswrapper[5047]: W0223 08:57:22.386694 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9d5da52b-350d-4489-ba83-8e8db44549b3/volumes/kubernetes.io~empty-dir/util Feb 23 08:57:22 crc kubenswrapper[5047]: I0223 08:57:22.386744 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util" (OuterVolumeSpecName: "util") pod "9d5da52b-350d-4489-ba83-8e8db44549b3" (UID: "9d5da52b-350d-4489-ba83-8e8db44549b3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5047]: I0223 08:57:22.388392 5047 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d5da52b-350d-4489-ba83-8e8db44549b3-util\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5047]: I0223 08:57:22.655346 5047 generic.go:334] "Generic (PLEG): container finished" podID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerID="c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70" exitCode=0 Feb 23 08:57:22 crc kubenswrapper[5047]: I0223 08:57:22.655392 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp6dp" event={"ID":"7b712e80-d659-48f1-a17a-32bd8c4d2e25","Type":"ContainerDied","Data":"c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70"} Feb 23 08:57:23 crc kubenswrapper[5047]: I0223 08:57:23.668469 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp6dp" event={"ID":"7b712e80-d659-48f1-a17a-32bd8c4d2e25","Type":"ContainerStarted","Data":"13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35"} Feb 23 08:57:23 crc kubenswrapper[5047]: I0223 08:57:23.721361 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kp6dp" podStartSLOduration=3.199388843 podStartE2EDuration="7.721339826s" podCreationTimestamp="2026-02-23 08:57:16 +0000 UTC" firstStartedPulling="2026-02-23 08:57:18.582367692 +0000 UTC m=+7960.833694826" lastFinishedPulling="2026-02-23 08:57:23.104318685 +0000 UTC m=+7965.355645809" observedRunningTime="2026-02-23 08:57:23.712868618 +0000 UTC m=+7965.964195752" watchObservedRunningTime="2026-02-23 08:57:23.721339826 +0000 UTC m=+7965.972666950" Feb 23 08:57:26 crc kubenswrapper[5047]: I0223 08:57:26.945007 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:26 crc kubenswrapper[5047]: I0223 08:57:26.947277 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:57:28 crc kubenswrapper[5047]: I0223 08:57:28.001854 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kp6dp" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" probeResult="failure" output=< Feb 23 08:57:28 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:57:28 crc kubenswrapper[5047]: > Feb 23 08:57:29 crc kubenswrapper[5047]: I0223 08:57:29.049239 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-57qv5"] Feb 23 08:57:29 crc kubenswrapper[5047]: I0223 08:57:29.058344 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-57qv5"] Feb 23 08:57:30 crc kubenswrapper[5047]: I0223 08:57:30.351587 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db167786-3d0a-4918-9f70-dfd9610eb0cd" path="/var/lib/kubelet/pods/db167786-3d0a-4918-9f70-dfd9610eb0cd/volumes" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.853499 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs"] Feb 23 08:57:33 crc kubenswrapper[5047]: E0223 08:57:33.855491 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerName="pull" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.855559 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerName="pull" Feb 23 08:57:33 crc kubenswrapper[5047]: E0223 08:57:33.855626 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerName="util" Feb 23 08:57:33 crc 
kubenswrapper[5047]: I0223 08:57:33.855678 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerName="util" Feb 23 08:57:33 crc kubenswrapper[5047]: E0223 08:57:33.855743 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerName="extract" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.855792 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerName="extract" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.856033 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d5da52b-350d-4489-ba83-8e8db44549b3" containerName="extract" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.856715 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.865965 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.876509 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.876771 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-sjrr5" Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.903730 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs"] Feb 23 08:57:33 crc kubenswrapper[5047]: I0223 08:57:33.962784 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b52zh\" (UniqueName: \"kubernetes.io/projected/ecb737b5-f02c-4aea-b4d0-93c516c3c258-kube-api-access-b52zh\") pod 
\"obo-prometheus-operator-68bc856cb9-r6zrs\" (UID: \"ecb737b5-f02c-4aea-b4d0-93c516c3c258\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.053435 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.054696 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.058501 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.058687 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fb6bp" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.066113 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b52zh\" (UniqueName: \"kubernetes.io/projected/ecb737b5-f02c-4aea-b4d0-93c516c3c258-kube-api-access-b52zh\") pod \"obo-prometheus-operator-68bc856cb9-r6zrs\" (UID: \"ecb737b5-f02c-4aea-b4d0-93c516c3c258\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.066702 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.074982 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.086214 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.117817 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b52zh\" (UniqueName: \"kubernetes.io/projected/ecb737b5-f02c-4aea-b4d0-93c516c3c258-kube-api-access-b52zh\") pod \"obo-prometheus-operator-68bc856cb9-r6zrs\" (UID: \"ecb737b5-f02c-4aea-b4d0-93c516c3c258\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.130953 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.171755 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e2c1208-29b7-4cda-8067-c429af1b5d63-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h\" (UID: \"8e2c1208-29b7-4cda-8067-c429af1b5d63\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.171865 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1e4753e-1eb8-4961-9669-72b6abe068e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8\" (UID: \"a1e4753e-1eb8-4961-9669-72b6abe068e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.171952 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e2c1208-29b7-4cda-8067-c429af1b5d63-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h\" (UID: \"8e2c1208-29b7-4cda-8067-c429af1b5d63\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.171986 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1e4753e-1eb8-4961-9669-72b6abe068e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8\" (UID: \"a1e4753e-1eb8-4961-9669-72b6abe068e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.185666 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.244994 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-x49nh"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.248186 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.259474 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.261809 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-dqj4b" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.262829 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-x49nh"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.274778 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e2c1208-29b7-4cda-8067-c429af1b5d63-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h\" (UID: \"8e2c1208-29b7-4cda-8067-c429af1b5d63\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.274849 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1e4753e-1eb8-4961-9669-72b6abe068e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8\" (UID: \"a1e4753e-1eb8-4961-9669-72b6abe068e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.274996 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e2c1208-29b7-4cda-8067-c429af1b5d63-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h\" (UID: \"8e2c1208-29b7-4cda-8067-c429af1b5d63\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.275074 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1e4753e-1eb8-4961-9669-72b6abe068e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8\" (UID: \"a1e4753e-1eb8-4961-9669-72b6abe068e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.293112 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1e4753e-1eb8-4961-9669-72b6abe068e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8\" (UID: \"a1e4753e-1eb8-4961-9669-72b6abe068e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.299548 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1e4753e-1eb8-4961-9669-72b6abe068e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8\" (UID: \"a1e4753e-1eb8-4961-9669-72b6abe068e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.301235 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8e2c1208-29b7-4cda-8067-c429af1b5d63-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h\" (UID: \"8e2c1208-29b7-4cda-8067-c429af1b5d63\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.302099 5047 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8e2c1208-29b7-4cda-8067-c429af1b5d63-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h\" (UID: \"8e2c1208-29b7-4cda-8067-c429af1b5d63\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.394281 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86t8j\" (UniqueName: \"kubernetes.io/projected/ebcba915-39c3-4516-89c6-ccc956b12b99-kube-api-access-86t8j\") pod \"observability-operator-59bdc8b94-x49nh\" (UID: \"ebcba915-39c3-4516-89c6-ccc956b12b99\") " pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.394748 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebcba915-39c3-4516-89c6-ccc956b12b99-observability-operator-tls\") pod \"observability-operator-59bdc8b94-x49nh\" (UID: \"ebcba915-39c3-4516-89c6-ccc956b12b99\") " pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.412718 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.414600 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.507580 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebcba915-39c3-4516-89c6-ccc956b12b99-observability-operator-tls\") pod \"observability-operator-59bdc8b94-x49nh\" (UID: \"ebcba915-39c3-4516-89c6-ccc956b12b99\") " pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.516337 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86t8j\" (UniqueName: \"kubernetes.io/projected/ebcba915-39c3-4516-89c6-ccc956b12b99-kube-api-access-86t8j\") pod \"observability-operator-59bdc8b94-x49nh\" (UID: \"ebcba915-39c3-4516-89c6-ccc956b12b99\") " pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.544079 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebcba915-39c3-4516-89c6-ccc956b12b99-observability-operator-tls\") pod \"observability-operator-59bdc8b94-x49nh\" (UID: \"ebcba915-39c3-4516-89c6-ccc956b12b99\") " pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.559458 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gv8g5"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.572242 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.578559 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-nvcmj" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.578575 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86t8j\" (UniqueName: \"kubernetes.io/projected/ebcba915-39c3-4516-89c6-ccc956b12b99-kube-api-access-86t8j\") pod \"observability-operator-59bdc8b94-x49nh\" (UID: \"ebcba915-39c3-4516-89c6-ccc956b12b99\") " pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.616009 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gv8g5"] Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.618239 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vd92\" (UniqueName: \"kubernetes.io/projected/b0642e24-413d-4eea-83fb-e959cdcbc6dd-kube-api-access-8vd92\") pod \"perses-operator-5bf474d74f-gv8g5\" (UID: \"b0642e24-413d-4eea-83fb-e959cdcbc6dd\") " pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.618345 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b0642e24-413d-4eea-83fb-e959cdcbc6dd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gv8g5\" (UID: \"b0642e24-413d-4eea-83fb-e959cdcbc6dd\") " pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.720566 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.721507 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd92\" (UniqueName: \"kubernetes.io/projected/b0642e24-413d-4eea-83fb-e959cdcbc6dd-kube-api-access-8vd92\") pod \"perses-operator-5bf474d74f-gv8g5\" (UID: \"b0642e24-413d-4eea-83fb-e959cdcbc6dd\") " pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.721589 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b0642e24-413d-4eea-83fb-e959cdcbc6dd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gv8g5\" (UID: \"b0642e24-413d-4eea-83fb-e959cdcbc6dd\") " pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.723542 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b0642e24-413d-4eea-83fb-e959cdcbc6dd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gv8g5\" (UID: \"b0642e24-413d-4eea-83fb-e959cdcbc6dd\") " pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.745130 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vd92\" (UniqueName: \"kubernetes.io/projected/b0642e24-413d-4eea-83fb-e959cdcbc6dd-kube-api-access-8vd92\") pod \"perses-operator-5bf474d74f-gv8g5\" (UID: \"b0642e24-413d-4eea-83fb-e959cdcbc6dd\") " pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:34 crc kubenswrapper[5047]: I0223 08:57:34.935824 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.045131 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs"] Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.289096 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8"] Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.350989 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-x49nh"] Feb 23 08:57:35 crc kubenswrapper[5047]: W0223 08:57:35.360701 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebcba915_39c3_4516_89c6_ccc956b12b99.slice/crio-16b51e021709989c0df1e0a612435ef5d9d1a93c201bf99a6c866c3ea77e6591 WatchSource:0}: Error finding container 16b51e021709989c0df1e0a612435ef5d9d1a93c201bf99a6c866c3ea77e6591: Status 404 returned error can't find the container with id 16b51e021709989c0df1e0a612435ef5d9d1a93c201bf99a6c866c3ea77e6591 Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.695220 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h"] Feb 23 08:57:35 crc kubenswrapper[5047]: W0223 08:57:35.705065 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2c1208_29b7_4cda_8067_c429af1b5d63.slice/crio-3933de23ff8ea6aa2b1659ded1207a28f99d40ba9f78edbb72b0052ad77281d4 WatchSource:0}: Error finding container 3933de23ff8ea6aa2b1659ded1207a28f99d40ba9f78edbb72b0052ad77281d4: Status 404 returned error can't find the container with id 3933de23ff8ea6aa2b1659ded1207a28f99d40ba9f78edbb72b0052ad77281d4 Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 
08:57:35.800297 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gv8g5"] Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.805052 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" event={"ID":"8e2c1208-29b7-4cda-8067-c429af1b5d63","Type":"ContainerStarted","Data":"3933de23ff8ea6aa2b1659ded1207a28f99d40ba9f78edbb72b0052ad77281d4"} Feb 23 08:57:35 crc kubenswrapper[5047]: W0223 08:57:35.806098 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0642e24_413d_4eea_83fb_e959cdcbc6dd.slice/crio-79826158c036c70c2c85dd2f0609a92c52e51c7dac15dd5f5cec21ea4e60c15f WatchSource:0}: Error finding container 79826158c036c70c2c85dd2f0609a92c52e51c7dac15dd5f5cec21ea4e60c15f: Status 404 returned error can't find the container with id 79826158c036c70c2c85dd2f0609a92c52e51c7dac15dd5f5cec21ea4e60c15f Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.806370 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" event={"ID":"ecb737b5-f02c-4aea-b4d0-93c516c3c258","Type":"ContainerStarted","Data":"e6478c3d501df8e506827c3a9d8b0d5b02613beaa8c0985ae350895b8f49ffeb"} Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.807759 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-x49nh" event={"ID":"ebcba915-39c3-4516-89c6-ccc956b12b99","Type":"ContainerStarted","Data":"16b51e021709989c0df1e0a612435ef5d9d1a93c201bf99a6c866c3ea77e6591"} Feb 23 08:57:35 crc kubenswrapper[5047]: I0223 08:57:35.808959 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" 
event={"ID":"a1e4753e-1eb8-4961-9669-72b6abe068e4","Type":"ContainerStarted","Data":"e0ff5260479c2bece434fb8bb388509d9bc41ded6be52db483b3a1d8449deedb"} Feb 23 08:57:36 crc kubenswrapper[5047]: I0223 08:57:36.823482 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" event={"ID":"b0642e24-413d-4eea-83fb-e959cdcbc6dd","Type":"ContainerStarted","Data":"79826158c036c70c2c85dd2f0609a92c52e51c7dac15dd5f5cec21ea4e60c15f"} Feb 23 08:57:38 crc kubenswrapper[5047]: I0223 08:57:38.019106 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kp6dp" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" probeResult="failure" output=< Feb 23 08:57:38 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:57:38 crc kubenswrapper[5047]: > Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.711872 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.712316 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.830275 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-x49nh" event={"ID":"ebcba915-39c3-4516-89c6-ccc956b12b99","Type":"ContainerStarted","Data":"66b552e192865dbcc411c2c0187af72680a4903f25197e41402a1dac2e95377b"} Feb 23 08:57:47 crc 
kubenswrapper[5047]: I0223 08:57:47.832207 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.853012 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" event={"ID":"b0642e24-413d-4eea-83fb-e959cdcbc6dd","Type":"ContainerStarted","Data":"74af4dfafe41ebba0df5e22ec4df6bf74875ba251eb58977efae233ea40b9108"} Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.853189 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.878985 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" event={"ID":"a1e4753e-1eb8-4961-9669-72b6abe068e4","Type":"ContainerStarted","Data":"2b7ee8cee1fa5d94c851508f2a48949a0481adabca80fbeece22935ee3c1296a"} Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.879066 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-x49nh" Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.902279 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-x49nh" podStartSLOduration=3.428825651 podStartE2EDuration="13.902258754s" podCreationTimestamp="2026-02-23 08:57:34 +0000 UTC" firstStartedPulling="2026-02-23 08:57:35.375589566 +0000 UTC m=+7977.626916700" lastFinishedPulling="2026-02-23 08:57:45.849022669 +0000 UTC m=+7988.100349803" observedRunningTime="2026-02-23 08:57:47.878503505 +0000 UTC m=+7990.129830639" watchObservedRunningTime="2026-02-23 08:57:47.902258754 +0000 UTC m=+7990.153585898" Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.910789 5047 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" event={"ID":"8e2c1208-29b7-4cda-8067-c429af1b5d63","Type":"ContainerStarted","Data":"3a7233b2fc0bc56024991cdd6ae46c7fbfb6dbd86e8d043458d9d74b6e5e513a"} Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.938820 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8" podStartSLOduration=3.415260447 podStartE2EDuration="13.93879619s" podCreationTimestamp="2026-02-23 08:57:34 +0000 UTC" firstStartedPulling="2026-02-23 08:57:35.32564315 +0000 UTC m=+7977.576970284" lastFinishedPulling="2026-02-23 08:57:45.849178893 +0000 UTC m=+7988.100506027" observedRunningTime="2026-02-23 08:57:47.917428654 +0000 UTC m=+7990.168755788" watchObservedRunningTime="2026-02-23 08:57:47.93879619 +0000 UTC m=+7990.190123324" Feb 23 08:57:47 crc kubenswrapper[5047]: I0223 08:57:47.961208 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" podStartSLOduration=3.822009761 podStartE2EDuration="13.961174551s" podCreationTimestamp="2026-02-23 08:57:34 +0000 UTC" firstStartedPulling="2026-02-23 08:57:35.808542567 +0000 UTC m=+7978.059869701" lastFinishedPulling="2026-02-23 08:57:45.947707357 +0000 UTC m=+7988.199034491" observedRunningTime="2026-02-23 08:57:47.94404741 +0000 UTC m=+7990.195374574" watchObservedRunningTime="2026-02-23 08:57:47.961174551 +0000 UTC m=+7990.212501695" Feb 23 08:57:48 crc kubenswrapper[5047]: I0223 08:57:48.038186 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h" podStartSLOduration=3.826609147 podStartE2EDuration="14.038164736s" podCreationTimestamp="2026-02-23 08:57:34 +0000 UTC" firstStartedPulling="2026-02-23 08:57:35.708611856 +0000 UTC m=+7977.959938990" 
lastFinishedPulling="2026-02-23 08:57:45.920167445 +0000 UTC m=+7988.171494579" observedRunningTime="2026-02-23 08:57:48.020333436 +0000 UTC m=+7990.271660570" watchObservedRunningTime="2026-02-23 08:57:48.038164736 +0000 UTC m=+7990.289491870" Feb 23 08:57:48 crc kubenswrapper[5047]: I0223 08:57:48.848590 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kp6dp" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" probeResult="failure" output=< Feb 23 08:57:48 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:57:48 crc kubenswrapper[5047]: > Feb 23 08:57:51 crc kubenswrapper[5047]: I0223 08:57:51.975974 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" event={"ID":"ecb737b5-f02c-4aea-b4d0-93c516c3c258","Type":"ContainerStarted","Data":"7ce2b20bb9dff72bcc202db6ba95ab3007f03baf3c3f0ea11daa46767a66510e"} Feb 23 08:57:54 crc kubenswrapper[5047]: I0223 08:57:54.942799 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-gv8g5" Feb 23 08:57:54 crc kubenswrapper[5047]: I0223 08:57:54.966146 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-r6zrs" podStartSLOduration=5.996428646 podStartE2EDuration="21.966120638s" podCreationTimestamp="2026-02-23 08:57:33 +0000 UTC" firstStartedPulling="2026-02-23 08:57:35.085415939 +0000 UTC m=+7977.336743073" lastFinishedPulling="2026-02-23 08:57:51.055107931 +0000 UTC m=+7993.306435065" observedRunningTime="2026-02-23 08:57:52.001653447 +0000 UTC m=+7994.252980581" watchObservedRunningTime="2026-02-23 08:57:54.966120638 +0000 UTC m=+7997.217447772" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.554514 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 
23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.555035 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="7c6ac225-0341-4574-b14d-8c904e846730" containerName="openstackclient" containerID="cri-o://ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad" gracePeriod=2 Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.574979 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.647145 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 23 08:57:57 crc kubenswrapper[5047]: E0223 08:57:57.648627 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c6ac225-0341-4574-b14d-8c904e846730" containerName="openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.648642 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c6ac225-0341-4574-b14d-8c904e846730" containerName="openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.648812 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c6ac225-0341-4574-b14d-8c904e846730" containerName="openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.649553 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.652307 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c6ac225-0341-4574-b14d-8c904e846730" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.715459 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.800501 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q86q\" (UniqueName: \"kubernetes.io/projected/d70c1128-8ff1-44a6-a81f-e314378abb7c-kube-api-access-6q86q\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.800575 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.800611 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.800645 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.849675 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.850992 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.855106 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lt25f" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.889257 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.910864 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q86q\" (UniqueName: \"kubernetes.io/projected/d70c1128-8ff1-44a6-a81f-e314378abb7c-kube-api-access-6q86q\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.910979 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.911014 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.911044 5047 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.927621 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.934444 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.935855 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.956946 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q86q\" (UniqueName: \"kubernetes.io/projected/d70c1128-8ff1-44a6-a81f-e314378abb7c-kube-api-access-6q86q\") pod \"openstackclient\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " pod="openstack/openstackclient" Feb 23 08:57:57 crc kubenswrapper[5047]: I0223 08:57:57.980390 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:57:58 crc kubenswrapper[5047]: I0223 08:57:58.012555 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcbp5\" (UniqueName: \"kubernetes.io/projected/db0d2a61-bbc7-4d94-b49e-18b6647e6ace-kube-api-access-fcbp5\") pod \"kube-state-metrics-0\" (UID: \"db0d2a61-bbc7-4d94-b49e-18b6647e6ace\") " pod="openstack/kube-state-metrics-0" Feb 23 08:57:58 crc kubenswrapper[5047]: I0223 08:57:58.057987 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kp6dp" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" probeResult="failure" output=< Feb 23 08:57:58 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:57:58 crc kubenswrapper[5047]: > Feb 23 08:57:58 crc kubenswrapper[5047]: I0223 08:57:58.118099 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcbp5\" (UniqueName: \"kubernetes.io/projected/db0d2a61-bbc7-4d94-b49e-18b6647e6ace-kube-api-access-fcbp5\") pod \"kube-state-metrics-0\" (UID: \"db0d2a61-bbc7-4d94-b49e-18b6647e6ace\") " pod="openstack/kube-state-metrics-0" Feb 23 08:57:58 crc kubenswrapper[5047]: I0223 08:57:58.181124 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcbp5\" (UniqueName: \"kubernetes.io/projected/db0d2a61-bbc7-4d94-b49e-18b6647e6ace-kube-api-access-fcbp5\") pod \"kube-state-metrics-0\" (UID: \"db0d2a61-bbc7-4d94-b49e-18b6647e6ace\") " pod="openstack/kube-state-metrics-0" Feb 23 08:57:58 crc kubenswrapper[5047]: I0223 08:57:58.468110 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 08:57:58 crc kubenswrapper[5047]: I0223 08:57:58.810579 5047 scope.go:117] "RemoveContainer" containerID="74c4507af5b6b674cb7bb060a05413012bf15153873749d5c24ee526557f6a28" Feb 23 08:57:58 crc kubenswrapper[5047]: I0223 08:57:58.917849 5047 scope.go:117] "RemoveContainer" containerID="dd2ad55b13c125642d0625ac8fe328fbfd54175872bdaba0e73a09bc93d524dc" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.054156 5047 scope.go:117] "RemoveContainer" containerID="6e9bd22e781b5077b022248894ff9a1e31e4f6f3ba638dd2ab4b5bb0fc69e656" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.071895 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.074779 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.082244 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.082604 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.082793 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.083153 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.083181 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-5q5qc" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.146676 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/alertmanager-metric-storage-0"] Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.170674 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.170737 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.171116 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.171245 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.171293 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-cluster-tls-config\") pod 
\"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.171437 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbnck\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-kube-api-access-dbnck\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.171538 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.276307 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.276363 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.276413 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbnck\" (UniqueName: 
\"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-kube-api-access-dbnck\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.276447 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.276522 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.276546 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.276595 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.285637 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.301070 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.302019 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.310108 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.310851 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.318010 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-volume\") pod 
\"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.334240 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbnck\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-kube-api-access-dbnck\") pod \"alertmanager-metric-storage-0\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.512557 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.540692 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.547175 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.551329 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.552811 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.553161 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.553322 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.553437 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-dr7pg" Feb 23 08:57:59 crc 
kubenswrapper[5047]: I0223 08:57:59.553566 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.553708 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.555309 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.559410 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.673164 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694112 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694208 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694238 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694264 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45nq\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-kube-api-access-x45nq\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694298 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694352 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694375 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694392 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: W0223 08:57:59.694506 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd70c1128_8ff1_44a6_a81f_e314378abb7c.slice/crio-1d568c0435812de96cb3c99882df0aead2ab8e9e264fd125372c3b5898334bdc WatchSource:0}: Error finding container 1d568c0435812de96cb3c99882df0aead2ab8e9e264fd125372c3b5898334bdc: Status 404 returned error can't find the container with id 1d568c0435812de96cb3c99882df0aead2ab8e9e264fd125372c3b5898334bdc Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694554 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.694683 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797487 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797581 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797618 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x45nq\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-kube-api-access-x45nq\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797678 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797722 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797761 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797784 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797852 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.797939 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.798029 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.800880 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.805351 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.806275 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.808507 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.812564 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: 
I0223 08:57:59.812643 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.817580 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.817608 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9c6f557254bb72755c4968ba73477eaf01a68b3243fbfe35b5777d635b954d0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.829300 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.832558 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45nq\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-kube-api-access-x45nq\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.839349 5047 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.856294 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:57:59 crc kubenswrapper[5047]: I0223 08:57:59.921828 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.103663 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.107265 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c6ac225-0341-4574-b14d-8c904e846730" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.159498 5047 generic.go:334] "Generic (PLEG): container finished" podID="7c6ac225-0341-4574-b14d-8c904e846730" containerID="ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad" exitCode=137 Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.159580 5047 scope.go:117] "RemoveContainer" containerID="ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.159593 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.165203 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"db0d2a61-bbc7-4d94-b49e-18b6647e6ace","Type":"ContainerStarted","Data":"2781e9576809c6aa578bc78c5e2fa63733b73fddb9d2c307b6c3be57bb3f797c"} Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.166795 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c6ac225-0341-4574-b14d-8c904e846730" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.168652 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d70c1128-8ff1-44a6-a81f-e314378abb7c","Type":"ContainerStarted","Data":"30d720e9d7d25a47b7ebb103f026fe844e5a94b9ebc55855ce996da5516316b1"} Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.168688 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d70c1128-8ff1-44a6-a81f-e314378abb7c","Type":"ContainerStarted","Data":"1d568c0435812de96cb3c99882df0aead2ab8e9e264fd125372c3b5898334bdc"} Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.176687 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.190870 5047 scope.go:117] "RemoveContainer" containerID="ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad" Feb 23 08:58:00 crc kubenswrapper[5047]: E0223 08:58:00.192219 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad\": container with ID starting with ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad not found: ID does not exist" 
containerID="ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.192254 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad"} err="failed to get container status \"ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad\": rpc error: code = NotFound desc = could not find container \"ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad\": container with ID starting with ced05c29dc6620c644d56af1bbfde94a7814b0753018993fa42f72ce7a42dbad not found: ID does not exist" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.202505 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.202699 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.2026782799999998 podStartE2EDuration="3.20267828s" podCreationTimestamp="2026-02-23 08:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:58:00.187731858 +0000 UTC m=+8002.439058992" watchObservedRunningTime="2026-02-23 08:58:00.20267828 +0000 UTC m=+8002.454005404" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.221549 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-combined-ca-bundle\") pod \"7c6ac225-0341-4574-b14d-8c904e846730\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.224379 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bx4s\" (UniqueName: 
\"kubernetes.io/projected/7c6ac225-0341-4574-b14d-8c904e846730-kube-api-access-6bx4s\") pod \"7c6ac225-0341-4574-b14d-8c904e846730\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.224664 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config\") pod \"7c6ac225-0341-4574-b14d-8c904e846730\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.224753 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config-secret\") pod \"7c6ac225-0341-4574-b14d-8c904e846730\" (UID: \"7c6ac225-0341-4574-b14d-8c904e846730\") " Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.229028 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c6ac225-0341-4574-b14d-8c904e846730-kube-api-access-6bx4s" (OuterVolumeSpecName: "kube-api-access-6bx4s") pod "7c6ac225-0341-4574-b14d-8c904e846730" (UID: "7c6ac225-0341-4574-b14d-8c904e846730"). InnerVolumeSpecName "kube-api-access-6bx4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.255868 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c6ac225-0341-4574-b14d-8c904e846730" (UID: "7c6ac225-0341-4574-b14d-8c904e846730"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.285412 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7c6ac225-0341-4574-b14d-8c904e846730" (UID: "7c6ac225-0341-4574-b14d-8c904e846730"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.295254 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "7c6ac225-0341-4574-b14d-8c904e846730" (UID: "7c6ac225-0341-4574-b14d-8c904e846730"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.329995 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.330018 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bx4s\" (UniqueName: \"kubernetes.io/projected/7c6ac225-0341-4574-b14d-8c904e846730-kube-api-access-6bx4s\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.330027 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.330036 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/7c6ac225-0341-4574-b14d-8c904e846730-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.382746 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c6ac225-0341-4574-b14d-8c904e846730" path="/var/lib/kubelet/pods/7c6ac225-0341-4574-b14d-8c904e846730/volumes" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.479315 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c6ac225-0341-4574-b14d-8c904e846730" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" Feb 23 08:58:00 crc kubenswrapper[5047]: I0223 08:58:00.755196 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:58:01 crc kubenswrapper[5047]: I0223 08:58:01.202866 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerStarted","Data":"fd9e969f29217f252ede0efae00289691d2ded6c6fe9ba4cdecd9365036422ea"} Feb 23 08:58:01 crc kubenswrapper[5047]: I0223 08:58:01.207745 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"db0d2a61-bbc7-4d94-b49e-18b6647e6ace","Type":"ContainerStarted","Data":"1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972"} Feb 23 08:58:01 crc kubenswrapper[5047]: I0223 08:58:01.209533 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 23 08:58:01 crc kubenswrapper[5047]: I0223 08:58:01.221566 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerStarted","Data":"236c408625a6130926edf10c8b3e051bf9fbc493b0cdf5a2a36d49b9fe228362"} Feb 23 08:58:01 crc kubenswrapper[5047]: I0223 08:58:01.245340 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.786783683 podStartE2EDuration="4.245314624s" podCreationTimestamp="2026-02-23 08:57:57 +0000 UTC" firstStartedPulling="2026-02-23 08:57:59.911810415 +0000 UTC m=+8002.163137549" lastFinishedPulling="2026-02-23 08:58:00.370341356 +0000 UTC m=+8002.621668490" observedRunningTime="2026-02-23 08:58:01.232197012 +0000 UTC m=+8003.483524166" watchObservedRunningTime="2026-02-23 08:58:01.245314624 +0000 UTC m=+8003.496641758" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.273438 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t42kc"] Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.282580 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.284572 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t42kc"] Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.296963 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerStarted","Data":"5e6ae55be7a72c1e27d0e6fc01b07b5f203566328042e654b410529194b8f5e0"} Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.299936 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerStarted","Data":"a616b652208daefb07b38cd626733fc90a295a0a07102e8c6660ad4f1c65e03e"} Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.343042 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-catalog-content\") pod 
\"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.343475 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkbj\" (UniqueName: \"kubernetes.io/projected/d9d58a9f-2416-4b89-8b12-1c59715953a9-kube-api-access-sxkbj\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.343938 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-utilities\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.447485 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-catalog-content\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.447629 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkbj\" (UniqueName: \"kubernetes.io/projected/d9d58a9f-2416-4b89-8b12-1c59715953a9-kube-api-access-sxkbj\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.447747 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-utilities\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.448113 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-catalog-content\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.448149 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-utilities\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.472688 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkbj\" (UniqueName: \"kubernetes.io/projected/d9d58a9f-2416-4b89-8b12-1c59715953a9-kube-api-access-sxkbj\") pod \"redhat-marketplace-t42kc\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:07 crc kubenswrapper[5047]: I0223 08:58:07.617102 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:08 crc kubenswrapper[5047]: I0223 08:58:08.002088 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kp6dp" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" probeResult="failure" output=< Feb 23 08:58:08 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:58:08 crc kubenswrapper[5047]: > Feb 23 08:58:08 crc kubenswrapper[5047]: I0223 08:58:08.195831 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t42kc"] Feb 23 08:58:08 crc kubenswrapper[5047]: I0223 08:58:08.310243 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t42kc" event={"ID":"d9d58a9f-2416-4b89-8b12-1c59715953a9","Type":"ContainerStarted","Data":"e9197d4386c10287a8f139a4d21dcb1db809ed11b166b8435c7f52cbfe62d9a1"} Feb 23 08:58:08 crc kubenswrapper[5047]: I0223 08:58:08.473811 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 23 08:58:09 crc kubenswrapper[5047]: I0223 08:58:09.363749 5047 generic.go:334] "Generic (PLEG): container finished" podID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerID="6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723" exitCode=0 Feb 23 08:58:09 crc kubenswrapper[5047]: I0223 08:58:09.363810 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t42kc" event={"ID":"d9d58a9f-2416-4b89-8b12-1c59715953a9","Type":"ContainerDied","Data":"6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723"} Feb 23 08:58:10 crc kubenswrapper[5047]: I0223 08:58:10.386027 5047 generic.go:334] "Generic (PLEG): container finished" podID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerID="e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7" exitCode=0 Feb 
23 08:58:10 crc kubenswrapper[5047]: I0223 08:58:10.386202 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t42kc" event={"ID":"d9d58a9f-2416-4b89-8b12-1c59715953a9","Type":"ContainerDied","Data":"e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7"} Feb 23 08:58:11 crc kubenswrapper[5047]: I0223 08:58:11.405805 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t42kc" event={"ID":"d9d58a9f-2416-4b89-8b12-1c59715953a9","Type":"ContainerStarted","Data":"1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a"} Feb 23 08:58:11 crc kubenswrapper[5047]: I0223 08:58:11.432140 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t42kc" podStartSLOduration=2.932766421 podStartE2EDuration="4.432111228s" podCreationTimestamp="2026-02-23 08:58:07 +0000 UTC" firstStartedPulling="2026-02-23 08:58:09.371762029 +0000 UTC m=+8011.623089163" lastFinishedPulling="2026-02-23 08:58:10.871106826 +0000 UTC m=+8013.122433970" observedRunningTime="2026-02-23 08:58:11.429439895 +0000 UTC m=+8013.680767069" watchObservedRunningTime="2026-02-23 08:58:11.432111228 +0000 UTC m=+8013.683438352" Feb 23 08:58:14 crc kubenswrapper[5047]: I0223 08:58:14.450776 5047 generic.go:334] "Generic (PLEG): container finished" podID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerID="5e6ae55be7a72c1e27d0e6fc01b07b5f203566328042e654b410529194b8f5e0" exitCode=0 Feb 23 08:58:14 crc kubenswrapper[5047]: I0223 08:58:14.451641 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerDied","Data":"5e6ae55be7a72c1e27d0e6fc01b07b5f203566328042e654b410529194b8f5e0"} Feb 23 08:58:14 crc kubenswrapper[5047]: I0223 08:58:14.457234 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerID="a616b652208daefb07b38cd626733fc90a295a0a07102e8c6660ad4f1c65e03e" exitCode=0 Feb 23 08:58:14 crc kubenswrapper[5047]: I0223 08:58:14.457495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerDied","Data":"a616b652208daefb07b38cd626733fc90a295a0a07102e8c6660ad4f1c65e03e"} Feb 23 08:58:16 crc kubenswrapper[5047]: I0223 08:58:16.759925 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:58:16 crc kubenswrapper[5047]: I0223 08:58:16.760297 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:58:16 crc kubenswrapper[5047]: I0223 08:58:16.760372 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 08:58:16 crc kubenswrapper[5047]: I0223 08:58:16.761334 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ec7e4d435932fd8405e5cc018a986356ba6984287b8f0f9c351f6397165ec6c"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:58:16 crc kubenswrapper[5047]: I0223 08:58:16.761399 5047 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://5ec7e4d435932fd8405e5cc018a986356ba6984287b8f0f9c351f6397165ec6c" gracePeriod=600 Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.515161 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="5ec7e4d435932fd8405e5cc018a986356ba6984287b8f0f9c351f6397165ec6c" exitCode=0 Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.515223 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"5ec7e4d435932fd8405e5cc018a986356ba6984287b8f0f9c351f6397165ec6c"} Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.515579 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492"} Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.515606 5047 scope.go:117] "RemoveContainer" containerID="a6ccd839aacd91812301f1634a9902213e713a00eb8ada7ef33234ca594ed873" Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.521045 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerStarted","Data":"7061c930f5e73ec3ef1ef35e6ca4955b8a79a7a14eda68ced588ceb181b02467"} Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.617629 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.617709 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:17 crc kubenswrapper[5047]: I0223 08:58:17.684543 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:18 crc kubenswrapper[5047]: I0223 08:58:18.004182 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kp6dp" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" probeResult="failure" output=< Feb 23 08:58:18 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s Feb 23 08:58:18 crc kubenswrapper[5047]: > Feb 23 08:58:18 crc kubenswrapper[5047]: I0223 08:58:18.578293 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:18 crc kubenswrapper[5047]: I0223 08:58:18.626539 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t42kc"] Feb 23 08:58:20 crc kubenswrapper[5047]: I0223 08:58:20.554698 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t42kc" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerName="registry-server" containerID="cri-o://1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a" gracePeriod=2 Feb 23 08:58:20 crc kubenswrapper[5047]: I0223 08:58:20.555216 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerStarted","Data":"8c1124f684fe17c61a20660889d0932a297693e7c21894c823edb60c364c744b"} Feb 23 08:58:20 crc kubenswrapper[5047]: I0223 08:58:20.555794 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 23 08:58:20 crc kubenswrapper[5047]: I0223 08:58:20.561037 5047 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 23 08:58:20 crc kubenswrapper[5047]: I0223 08:58:20.602116 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.116099683 podStartE2EDuration="22.602092641s" podCreationTimestamp="2026-02-23 08:57:58 +0000 UTC" firstStartedPulling="2026-02-23 08:58:00.190816391 +0000 UTC m=+8002.442143525" lastFinishedPulling="2026-02-23 08:58:16.676809349 +0000 UTC m=+8018.928136483" observedRunningTime="2026-02-23 08:58:20.585652858 +0000 UTC m=+8022.836980032" watchObservedRunningTime="2026-02-23 08:58:20.602092641 +0000 UTC m=+8022.853419805" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.031242 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.190873 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-utilities\") pod \"d9d58a9f-2416-4b89-8b12-1c59715953a9\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.191161 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxkbj\" (UniqueName: \"kubernetes.io/projected/d9d58a9f-2416-4b89-8b12-1c59715953a9-kube-api-access-sxkbj\") pod \"d9d58a9f-2416-4b89-8b12-1c59715953a9\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.191424 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-catalog-content\") pod \"d9d58a9f-2416-4b89-8b12-1c59715953a9\" (UID: \"d9d58a9f-2416-4b89-8b12-1c59715953a9\") " Feb 23 08:58:21 crc 
kubenswrapper[5047]: I0223 08:58:21.191582 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-utilities" (OuterVolumeSpecName: "utilities") pod "d9d58a9f-2416-4b89-8b12-1c59715953a9" (UID: "d9d58a9f-2416-4b89-8b12-1c59715953a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.192155 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.196654 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d58a9f-2416-4b89-8b12-1c59715953a9-kube-api-access-sxkbj" (OuterVolumeSpecName: "kube-api-access-sxkbj") pod "d9d58a9f-2416-4b89-8b12-1c59715953a9" (UID: "d9d58a9f-2416-4b89-8b12-1c59715953a9"). InnerVolumeSpecName "kube-api-access-sxkbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.211941 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9d58a9f-2416-4b89-8b12-1c59715953a9" (UID: "d9d58a9f-2416-4b89-8b12-1c59715953a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.294415 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d58a9f-2416-4b89-8b12-1c59715953a9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.294468 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxkbj\" (UniqueName: \"kubernetes.io/projected/d9d58a9f-2416-4b89-8b12-1c59715953a9-kube-api-access-sxkbj\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.569242 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerStarted","Data":"74bb043a6f113ba465605487bf47a03f605f2521de6d3affcfee6d08e21de58b"} Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.573063 5047 generic.go:334] "Generic (PLEG): container finished" podID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerID="1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a" exitCode=0 Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.573483 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t42kc" event={"ID":"d9d58a9f-2416-4b89-8b12-1c59715953a9","Type":"ContainerDied","Data":"1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a"} Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.573540 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t42kc" event={"ID":"d9d58a9f-2416-4b89-8b12-1c59715953a9","Type":"ContainerDied","Data":"e9197d4386c10287a8f139a4d21dcb1db809ed11b166b8435c7f52cbfe62d9a1"} Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.573580 5047 scope.go:117] "RemoveContainer" containerID="1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a" 
Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.573825 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t42kc" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.619570 5047 scope.go:117] "RemoveContainer" containerID="e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.642926 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t42kc"] Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.657672 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t42kc"] Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.663050 5047 scope.go:117] "RemoveContainer" containerID="6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.692437 5047 scope.go:117] "RemoveContainer" containerID="1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a" Feb 23 08:58:21 crc kubenswrapper[5047]: E0223 08:58:21.693029 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a\": container with ID starting with 1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a not found: ID does not exist" containerID="1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.693082 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a"} err="failed to get container status \"1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a\": rpc error: code = NotFound desc = could not find container 
\"1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a\": container with ID starting with 1eea5a7e721ea52eaea5a46223eef2854668b5bc3a946ae7e7c12bb70cfe489a not found: ID does not exist" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.693117 5047 scope.go:117] "RemoveContainer" containerID="e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7" Feb 23 08:58:21 crc kubenswrapper[5047]: E0223 08:58:21.694557 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7\": container with ID starting with e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7 not found: ID does not exist" containerID="e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.694636 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7"} err="failed to get container status \"e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7\": rpc error: code = NotFound desc = could not find container \"e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7\": container with ID starting with e334eab3f03857caadaba0f206c7faa1b2efa1bb4ed911292c96b091da36afb7 not found: ID does not exist" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.694704 5047 scope.go:117] "RemoveContainer" containerID="6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723" Feb 23 08:58:21 crc kubenswrapper[5047]: E0223 08:58:21.695166 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723\": container with ID starting with 6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723 not found: ID does not exist" 
containerID="6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723" Feb 23 08:58:21 crc kubenswrapper[5047]: I0223 08:58:21.695194 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723"} err="failed to get container status \"6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723\": rpc error: code = NotFound desc = could not find container \"6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723\": container with ID starting with 6eed3120889b5e8e91f9e68a00e23de22fd8212ac451bb8d830e35982a209723 not found: ID does not exist" Feb 23 08:58:22 crc kubenswrapper[5047]: I0223 08:58:22.360899 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" path="/var/lib/kubelet/pods/d9d58a9f-2416-4b89-8b12-1c59715953a9/volumes" Feb 23 08:58:26 crc kubenswrapper[5047]: I0223 08:58:26.652226 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerStarted","Data":"fbe2e293d22cc9ab907d434c3ac2625dca11f93588e6be42390744ada31ad4f0"} Feb 23 08:58:27 crc kubenswrapper[5047]: I0223 08:58:27.033159 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:58:27 crc kubenswrapper[5047]: I0223 08:58:27.107810 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:58:27 crc kubenswrapper[5047]: I0223 08:58:27.296556 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kp6dp"] Feb 23 08:58:28 crc kubenswrapper[5047]: I0223 08:58:28.677308 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kp6dp" 
podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" containerID="cri-o://13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35" gracePeriod=2 Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.493206 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.617080 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrlh2\" (UniqueName: \"kubernetes.io/projected/7b712e80-d659-48f1-a17a-32bd8c4d2e25-kube-api-access-zrlh2\") pod \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.617156 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-utilities\") pod \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.617177 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-catalog-content\") pod \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\" (UID: \"7b712e80-d659-48f1-a17a-32bd8c4d2e25\") " Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.618557 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-utilities" (OuterVolumeSpecName: "utilities") pod "7b712e80-d659-48f1-a17a-32bd8c4d2e25" (UID: "7b712e80-d659-48f1-a17a-32bd8c4d2e25"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.625536 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b712e80-d659-48f1-a17a-32bd8c4d2e25-kube-api-access-zrlh2" (OuterVolumeSpecName: "kube-api-access-zrlh2") pod "7b712e80-d659-48f1-a17a-32bd8c4d2e25" (UID: "7b712e80-d659-48f1-a17a-32bd8c4d2e25"). InnerVolumeSpecName "kube-api-access-zrlh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.720386 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrlh2\" (UniqueName: \"kubernetes.io/projected/7b712e80-d659-48f1-a17a-32bd8c4d2e25-kube-api-access-zrlh2\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.720385 5047 generic.go:334] "Generic (PLEG): container finished" podID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerID="13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35" exitCode=0 Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.720427 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp6dp" event={"ID":"7b712e80-d659-48f1-a17a-32bd8c4d2e25","Type":"ContainerDied","Data":"13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35"} Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.720541 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kp6dp" event={"ID":"7b712e80-d659-48f1-a17a-32bd8c4d2e25","Type":"ContainerDied","Data":"0139cff1d14dd6b65deaf28ba6e1ff5aa1b48426f0f91b77688410e2eb01d72c"} Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.720571 5047 scope.go:117] "RemoveContainer" containerID="13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.720693 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kp6dp" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.721521 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.723791 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerStarted","Data":"d8a161be5f8b4126f518619c51280e8a08feca1a551976743ab9b803ce0e215c"} Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.744523 5047 scope.go:117] "RemoveContainer" containerID="c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.758927 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.225057659 podStartE2EDuration="31.7588916s" podCreationTimestamp="2026-02-23 08:57:58 +0000 UTC" firstStartedPulling="2026-02-23 08:58:00.754394551 +0000 UTC m=+8003.005721685" lastFinishedPulling="2026-02-23 08:58:29.288228482 +0000 UTC m=+8031.539555626" observedRunningTime="2026-02-23 08:58:29.751091809 +0000 UTC m=+8032.002418953" watchObservedRunningTime="2026-02-23 08:58:29.7588916 +0000 UTC m=+8032.010218734" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.772398 5047 scope.go:117] "RemoveContainer" containerID="c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.774527 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b712e80-d659-48f1-a17a-32bd8c4d2e25" (UID: "7b712e80-d659-48f1-a17a-32bd8c4d2e25"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.824152 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b712e80-d659-48f1-a17a-32bd8c4d2e25-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.824654 5047 scope.go:117] "RemoveContainer" containerID="13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35" Feb 23 08:58:29 crc kubenswrapper[5047]: E0223 08:58:29.826176 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35\": container with ID starting with 13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35 not found: ID does not exist" containerID="13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.826221 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35"} err="failed to get container status \"13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35\": rpc error: code = NotFound desc = could not find container \"13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35\": container with ID starting with 13e5c349600ae98de0378cd85fe7941eabc2d9437f75029dcb21d95ed3dfcc35 not found: ID does not exist" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.826250 5047 scope.go:117] "RemoveContainer" containerID="c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70" Feb 23 08:58:29 crc kubenswrapper[5047]: E0223 08:58:29.826560 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70\": container with ID starting with c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70 not found: ID does not exist" containerID="c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.826587 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70"} err="failed to get container status \"c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70\": rpc error: code = NotFound desc = could not find container \"c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70\": container with ID starting with c17fdc9eb3b5a57942d8220516e1acb33273a54b24e3784e03fe4db12860ce70 not found: ID does not exist" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.826603 5047 scope.go:117] "RemoveContainer" containerID="c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774" Feb 23 08:58:29 crc kubenswrapper[5047]: E0223 08:58:29.827247 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774\": container with ID starting with c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774 not found: ID does not exist" containerID="c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774" Feb 23 08:58:29 crc kubenswrapper[5047]: I0223 08:58:29.827306 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774"} err="failed to get container status \"c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774\": rpc error: code = NotFound desc = could not find container \"c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774\": container with ID 
starting with c020d33c0eee17adb43af6bf6f030e331748d86d0b54d390dff0ea3892708774 not found: ID does not exist" Feb 23 08:58:30 crc kubenswrapper[5047]: I0223 08:58:30.074486 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kp6dp"] Feb 23 08:58:30 crc kubenswrapper[5047]: I0223 08:58:30.086705 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kp6dp"] Feb 23 08:58:30 crc kubenswrapper[5047]: I0223 08:58:30.203519 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:30 crc kubenswrapper[5047]: I0223 08:58:30.203586 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:30 crc kubenswrapper[5047]: I0223 08:58:30.206073 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:30 crc kubenswrapper[5047]: I0223 08:58:30.362380 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" path="/var/lib/kubelet/pods/7b712e80-d659-48f1-a17a-32bd8c4d2e25/volumes" Feb 23 08:58:30 crc kubenswrapper[5047]: I0223 08:58:30.740901 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.625568 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.626659 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" containerName="openstackclient" containerID="cri-o://30d720e9d7d25a47b7ebb103f026fe844e5a94b9ebc55855ce996da5516316b1" gracePeriod=2 Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.635078 5047 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.680695 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 23 08:58:32 crc kubenswrapper[5047]: E0223 08:58:32.681252 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" containerName="openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681275 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" containerName="openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: E0223 08:58:32.681291 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerName="extract-content" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681301 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerName="extract-content" Feb 23 08:58:32 crc kubenswrapper[5047]: E0223 08:58:32.681322 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="extract-utilities" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681331 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="extract-utilities" Feb 23 08:58:32 crc kubenswrapper[5047]: E0223 08:58:32.681343 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681351 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" Feb 23 08:58:32 crc kubenswrapper[5047]: E0223 08:58:32.681367 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" 
containerName="registry-server" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681376 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerName="registry-server" Feb 23 08:58:32 crc kubenswrapper[5047]: E0223 08:58:32.681434 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="extract-content" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681445 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="extract-content" Feb 23 08:58:32 crc kubenswrapper[5047]: E0223 08:58:32.681456 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerName="extract-utilities" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681464 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerName="extract-utilities" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681716 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d58a9f-2416-4b89-8b12-1c59715953a9" containerName="registry-server" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681743 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" containerName="openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.681760 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b712e80-d659-48f1-a17a-32bd8c4d2e25" containerName="registry-server" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.682574 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.687949 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" podUID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.697403 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.808045 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.808321 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.808574 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.808776 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjxld\" (UniqueName: \"kubernetes.io/projected/5eaba05e-ecdc-472a-b77f-a4fd716b467e-kube-api-access-vjxld\") pod \"openstackclient\" (UID: 
\"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.910355 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.910493 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.910659 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.910837 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjxld\" (UniqueName: \"kubernetes.io/projected/5eaba05e-ecdc-472a-b77f-a4fd716b467e-kube-api-access-vjxld\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.911427 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.916022 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.916043 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:32 crc kubenswrapper[5047]: I0223 08:58:32.938104 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjxld\" (UniqueName: \"kubernetes.io/projected/5eaba05e-ecdc-472a-b77f-a4fd716b467e-kube-api-access-vjxld\") pod \"openstackclient\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " pod="openstack/openstackclient" Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.013945 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.547132 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.770098 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5eaba05e-ecdc-472a-b77f-a4fd716b467e","Type":"ContainerStarted","Data":"69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76"} Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.770418 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5eaba05e-ecdc-472a-b77f-a4fd716b467e","Type":"ContainerStarted","Data":"b2d1364d2377b55cec89e9e6dd0d2263842116d06a3bbf2b60af00450d28796f"} Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.790233 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.790210767 podStartE2EDuration="1.790210767s" podCreationTimestamp="2026-02-23 08:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:58:33.789506149 +0000 UTC m=+8036.040833283" watchObservedRunningTime="2026-02-23 08:58:33.790210767 +0000 UTC m=+8036.041537911" Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.934060 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.934437 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="prometheus" containerID="cri-o://74bb043a6f113ba465605487bf47a03f605f2521de6d3affcfee6d08e21de58b" gracePeriod=600 Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.934594 5047 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="thanos-sidecar" containerID="cri-o://d8a161be5f8b4126f518619c51280e8a08feca1a551976743ab9b803ce0e215c" gracePeriod=600 Feb 23 08:58:33 crc kubenswrapper[5047]: I0223 08:58:33.934660 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="config-reloader" containerID="cri-o://fbe2e293d22cc9ab907d434c3ac2625dca11f93588e6be42390744ada31ad4f0" gracePeriod=600 Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.805376 5047 generic.go:334] "Generic (PLEG): container finished" podID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerID="d8a161be5f8b4126f518619c51280e8a08feca1a551976743ab9b803ce0e215c" exitCode=0 Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.806285 5047 generic.go:334] "Generic (PLEG): container finished" podID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerID="fbe2e293d22cc9ab907d434c3ac2625dca11f93588e6be42390744ada31ad4f0" exitCode=0 Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.806299 5047 generic.go:334] "Generic (PLEG): container finished" podID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerID="74bb043a6f113ba465605487bf47a03f605f2521de6d3affcfee6d08e21de58b" exitCode=0 Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.805511 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerDied","Data":"d8a161be5f8b4126f518619c51280e8a08feca1a551976743ab9b803ce0e215c"} Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.806366 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerDied","Data":"fbe2e293d22cc9ab907d434c3ac2625dca11f93588e6be42390744ada31ad4f0"} Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.806385 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerDied","Data":"74bb043a6f113ba465605487bf47a03f605f2521de6d3affcfee6d08e21de58b"} Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.815384 5047 generic.go:334] "Generic (PLEG): container finished" podID="d70c1128-8ff1-44a6-a81f-e314378abb7c" containerID="30d720e9d7d25a47b7ebb103f026fe844e5a94b9ebc55855ce996da5516316b1" exitCode=137 Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.897082 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.915807 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950234 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config-out\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950298 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-0\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950332 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-1\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950370 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x45nq\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-kube-api-access-x45nq\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950442 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-web-config\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950475 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-tls-assets\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950621 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950683 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-thanos-prometheus-http-client-file\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950770 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-2\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.950828 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config\") pod \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\" (UID: \"049b1f30-3f9e-4bcc-865e-2cd923125eb6\") " Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.953705 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.954089 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.957758 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config-out" (OuterVolumeSpecName: "config-out") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.964991 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.968062 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-kube-api-access-x45nq" (OuterVolumeSpecName: "kube-api-access-x45nq") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "kube-api-access-x45nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.968281 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.978813 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.986036 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config" (OuterVolumeSpecName: "config") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.996469 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 08:58:34 crc kubenswrapper[5047]: I0223 08:58:34.996677 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-web-config" (OuterVolumeSpecName: "web-config") pod "049b1f30-3f9e-4bcc-865e-2cd923125eb6" (UID: "049b1f30-3f9e-4bcc-865e-2cd923125eb6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.052399 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config\") pod \"d70c1128-8ff1-44a6-a81f-e314378abb7c\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.052514 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config-secret\") pod \"d70c1128-8ff1-44a6-a81f-e314378abb7c\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.052599 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-combined-ca-bundle\") pod 
\"d70c1128-8ff1-44a6-a81f-e314378abb7c\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.052647 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q86q\" (UniqueName: \"kubernetes.io/projected/d70c1128-8ff1-44a6-a81f-e314378abb7c-kube-api-access-6q86q\") pod \"d70c1128-8ff1-44a6-a81f-e314378abb7c\" (UID: \"d70c1128-8ff1-44a6-a81f-e314378abb7c\") " Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053148 5047 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053161 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053170 5047 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/049b1f30-3f9e-4bcc-865e-2cd923125eb6-config-out\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053180 5047 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053190 5047 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/049b1f30-3f9e-4bcc-865e-2cd923125eb6-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053200 5047 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x45nq\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-kube-api-access-x45nq\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053208 5047 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-web-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053216 5047 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049b1f30-3f9e-4bcc-865e-2cd923125eb6-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053237 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") on node \"crc\" " Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.053248 5047 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/049b1f30-3f9e-4bcc-865e-2cd923125eb6-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.068464 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70c1128-8ff1-44a6-a81f-e314378abb7c-kube-api-access-6q86q" (OuterVolumeSpecName: "kube-api-access-6q86q") pod "d70c1128-8ff1-44a6-a81f-e314378abb7c" (UID: "d70c1128-8ff1-44a6-a81f-e314378abb7c"). InnerVolumeSpecName "kube-api-access-6q86q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.075452 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.075682 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5") on node "crc" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.076938 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d70c1128-8ff1-44a6-a81f-e314378abb7c" (UID: "d70c1128-8ff1-44a6-a81f-e314378abb7c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.086917 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d70c1128-8ff1-44a6-a81f-e314378abb7c" (UID: "d70c1128-8ff1-44a6-a81f-e314378abb7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.110570 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d70c1128-8ff1-44a6-a81f-e314378abb7c" (UID: "d70c1128-8ff1-44a6-a81f-e314378abb7c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.155827 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.155876 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.155889 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d70c1128-8ff1-44a6-a81f-e314378abb7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.155971 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.155985 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q86q\" (UniqueName: \"kubernetes.io/projected/d70c1128-8ff1-44a6-a81f-e314378abb7c-kube-api-access-6q86q\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.832633 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"049b1f30-3f9e-4bcc-865e-2cd923125eb6","Type":"ContainerDied","Data":"236c408625a6130926edf10c8b3e051bf9fbc493b0cdf5a2a36d49b9fe228362"} Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.832696 5047 scope.go:117] "RemoveContainer" containerID="d8a161be5f8b4126f518619c51280e8a08feca1a551976743ab9b803ce0e215c" Feb 23 08:58:35 crc 
kubenswrapper[5047]: I0223 08:58:35.832713 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.835982 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.874968 5047 scope.go:117] "RemoveContainer" containerID="fbe2e293d22cc9ab907d434c3ac2625dca11f93588e6be42390744ada31ad4f0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.878171 5047 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" podUID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.885120 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.897674 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.915465 5047 scope.go:117] "RemoveContainer" containerID="74bb043a6f113ba465605487bf47a03f605f2521de6d3affcfee6d08e21de58b" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.926984 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:58:35 crc kubenswrapper[5047]: E0223 08:58:35.927501 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="init-config-reloader" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.927523 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="init-config-reloader" Feb 23 08:58:35 crc kubenswrapper[5047]: E0223 08:58:35.927538 5047 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="config-reloader" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.927546 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="config-reloader" Feb 23 08:58:35 crc kubenswrapper[5047]: E0223 08:58:35.927561 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="thanos-sidecar" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.927570 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="thanos-sidecar" Feb 23 08:58:35 crc kubenswrapper[5047]: E0223 08:58:35.927600 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="prometheus" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.927608 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="prometheus" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.927834 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="prometheus" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.927848 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="config-reloader" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.927878 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" containerName="thanos-sidecar" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.930204 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.937746 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.938110 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.938356 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-dr7pg" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.938639 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.940635 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.940876 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.948079 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.948173 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.951190 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.966597 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.978858 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.978953 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.978997 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979229 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979290 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: 
\"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979372 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979432 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979476 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979550 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5pdn\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-kube-api-access-t5pdn\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979599 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979690 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979784 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:35 crc kubenswrapper[5047]: I0223 08:58:35.979851 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.002789 5047 scope.go:117] "RemoveContainer" containerID="5e6ae55be7a72c1e27d0e6fc01b07b5f203566328042e654b410529194b8f5e0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.027669 5047 scope.go:117] "RemoveContainer" 
containerID="30d720e9d7d25a47b7ebb103f026fe844e5a94b9ebc55855ce996da5516316b1" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.082025 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.082412 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.082524 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.082664 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.082767 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.082930 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.083046 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5pdn\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-kube-api-access-t5pdn\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.082967 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.083237 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.083470 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.083617 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.083752 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.083845 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.083969 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 
08:58:36.084040 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.084186 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.087307 5047 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.087336 5047 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9c6f557254bb72755c4968ba73477eaf01a68b3243fbfe35b5777d635b954d0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.088953 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.089118 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.089188 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.089494 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.089587 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.089840 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.091050 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.104500 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.107731 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t5pdn\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-kube-api-access-t5pdn\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.153804 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"prometheus-metric-storage-0\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.278993 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.358487 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049b1f30-3f9e-4bcc-865e-2cd923125eb6" path="/var/lib/kubelet/pods/049b1f30-3f9e-4bcc-865e-2cd923125eb6/volumes" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.359397 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70c1128-8ff1-44a6-a81f-e314378abb7c" path="/var/lib/kubelet/pods/d70c1128-8ff1-44a6-a81f-e314378abb7c/volumes" Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.796223 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:58:36 crc kubenswrapper[5047]: W0223 08:58:36.796577 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac0b4e9_66ad_4719_b036_d9e833ad7a37.slice/crio-ea290947b23bacf10558492b4748209193c0c5af0f3e71634623034bd1b1875d WatchSource:0}: Error finding container ea290947b23bacf10558492b4748209193c0c5af0f3e71634623034bd1b1875d: Status 404 returned error can't find the container with id 
ea290947b23bacf10558492b4748209193c0c5af0f3e71634623034bd1b1875d Feb 23 08:58:36 crc kubenswrapper[5047]: I0223 08:58:36.851849 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerStarted","Data":"ea290947b23bacf10558492b4748209193c0c5af0f3e71634623034bd1b1875d"} Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.277234 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.282652 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.284532 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.295051 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.300367 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.428252 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-config-data\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.428306 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9vhw\" (UniqueName: \"kubernetes.io/projected/c23289c0-430c-4d64-b177-74a2eba5a09f-kube-api-access-g9vhw\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.428333 5047 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-log-httpd\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.428360 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.428383 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-scripts\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.428504 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-run-httpd\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.428713 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.536571 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-config-data\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.537295 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9vhw\" (UniqueName: \"kubernetes.io/projected/c23289c0-430c-4d64-b177-74a2eba5a09f-kube-api-access-g9vhw\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.537358 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-log-httpd\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.537432 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.537475 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-scripts\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.537534 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-run-httpd\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 
08:58:38.537779 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.537788 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-log-httpd\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.538499 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-run-httpd\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.543464 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-scripts\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.545090 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-config-data\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.546192 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " 
pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.547656 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.558831 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9vhw\" (UniqueName: \"kubernetes.io/projected/c23289c0-430c-4d64-b177-74a2eba5a09f-kube-api-access-g9vhw\") pod \"ceilometer-0\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " pod="openstack/ceilometer-0" Feb 23 08:58:38 crc kubenswrapper[5047]: I0223 08:58:38.608168 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:58:39 crc kubenswrapper[5047]: I0223 08:58:39.077957 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:58:39 crc kubenswrapper[5047]: I0223 08:58:39.883228 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerStarted","Data":"619ce537754aa39af5c2ccf288d5f888ccae8ae8cee84603f28541cd847cd34b"} Feb 23 08:58:40 crc kubenswrapper[5047]: I0223 08:58:40.894398 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerStarted","Data":"b6c3d6a20c48b88b3ebe6dbb452453fb26811d2587e79ad60243317acb4b1af9"} Feb 23 08:58:43 crc kubenswrapper[5047]: I0223 08:58:43.926457 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerStarted","Data":"5cd3d57ab8936ab135a4d4e6c8dad445aef3a80cf6f2b4a5f7d09c87f12a0c99"} Feb 23 08:58:44 crc 
kubenswrapper[5047]: I0223 08:58:44.938119 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerStarted","Data":"1194df452564d0b4c44a973f143079606911ffd9ef3df159521cc2c9a83dcf07"} Feb 23 08:58:45 crc kubenswrapper[5047]: I0223 08:58:45.947932 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerStarted","Data":"0e21d931a8dad7e4916143d30a545a73726a9eb165311b72ba230311ec886bb6"} Feb 23 08:58:47 crc kubenswrapper[5047]: I0223 08:58:47.969012 5047 generic.go:334] "Generic (PLEG): container finished" podID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerID="b6c3d6a20c48b88b3ebe6dbb452453fb26811d2587e79ad60243317acb4b1af9" exitCode=0 Feb 23 08:58:47 crc kubenswrapper[5047]: I0223 08:58:47.969065 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerDied","Data":"b6c3d6a20c48b88b3ebe6dbb452453fb26811d2587e79ad60243317acb4b1af9"} Feb 23 08:58:47 crc kubenswrapper[5047]: I0223 08:58:47.973277 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerStarted","Data":"3d289c02cef4ff3f3680dff6740e61f04461c782c8c63d6d33b366f845545f37"} Feb 23 08:58:47 crc kubenswrapper[5047]: I0223 08:58:47.973490 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 08:58:48 crc kubenswrapper[5047]: I0223 08:58:48.063890 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.374365629 podStartE2EDuration="10.063866754s" podCreationTimestamp="2026-02-23 08:58:38 +0000 UTC" firstStartedPulling="2026-02-23 08:58:39.086114248 +0000 UTC m=+8041.337441382" 
lastFinishedPulling="2026-02-23 08:58:46.775615373 +0000 UTC m=+8049.026942507" observedRunningTime="2026-02-23 08:58:48.055444257 +0000 UTC m=+8050.306771411" watchObservedRunningTime="2026-02-23 08:58:48.063866754 +0000 UTC m=+8050.315193888" Feb 23 08:58:48 crc kubenswrapper[5047]: I0223 08:58:48.999309 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerStarted","Data":"4f7304ad6ed4398aa70c150989b050ad9909a66441b44da2cc965ae4528229d0"} Feb 23 08:58:52 crc kubenswrapper[5047]: I0223 08:58:52.031757 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerStarted","Data":"e6a7f689edafc92eda919136c37b6c2443ad2862e793b4c249f8c7326840442a"} Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.051822 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerStarted","Data":"f8aec307320ad2b429f05d28aac10036e7f2f6cdf40d47a064a28a6e546628b0"} Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.093901 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.093876243 podStartE2EDuration="18.093876243s" podCreationTimestamp="2026-02-23 08:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:58:53.083462032 +0000 UTC m=+8055.334789186" watchObservedRunningTime="2026-02-23 08:58:53.093876243 +0000 UTC m=+8055.345203387" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.348509 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-lmjpr"] Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.356474 5047 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.389174 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-lmjpr"] Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.415381 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-operator-scripts\") pod \"aodh-db-create-lmjpr\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.415668 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9c7m\" (UniqueName: \"kubernetes.io/projected/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-kube-api-access-d9c7m\") pod \"aodh-db-create-lmjpr\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.445108 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-dcb1-account-create-update-96tpc"] Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.446522 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.451481 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.460136 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-dcb1-account-create-update-96tpc"] Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.516956 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9c7m\" (UniqueName: \"kubernetes.io/projected/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-kube-api-access-d9c7m\") pod \"aodh-db-create-lmjpr\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.517013 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-operator-scripts\") pod \"aodh-db-create-lmjpr\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.517100 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdscn\" (UniqueName: \"kubernetes.io/projected/031265ff-da09-4cb9-b2e8-f16da9486723-kube-api-access-xdscn\") pod \"aodh-dcb1-account-create-update-96tpc\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.517185 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031265ff-da09-4cb9-b2e8-f16da9486723-operator-scripts\") pod \"aodh-dcb1-account-create-update-96tpc\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " 
pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.518230 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-operator-scripts\") pod \"aodh-db-create-lmjpr\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.540656 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9c7m\" (UniqueName: \"kubernetes.io/projected/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-kube-api-access-d9c7m\") pod \"aodh-db-create-lmjpr\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.619061 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdscn\" (UniqueName: \"kubernetes.io/projected/031265ff-da09-4cb9-b2e8-f16da9486723-kube-api-access-xdscn\") pod \"aodh-dcb1-account-create-update-96tpc\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.619181 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031265ff-da09-4cb9-b2e8-f16da9486723-operator-scripts\") pod \"aodh-dcb1-account-create-update-96tpc\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.620087 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031265ff-da09-4cb9-b2e8-f16da9486723-operator-scripts\") pod \"aodh-dcb1-account-create-update-96tpc\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " 
pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.638398 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdscn\" (UniqueName: \"kubernetes.io/projected/031265ff-da09-4cb9-b2e8-f16da9486723-kube-api-access-xdscn\") pod \"aodh-dcb1-account-create-update-96tpc\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.701389 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:53 crc kubenswrapper[5047]: I0223 08:58:53.773988 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:54 crc kubenswrapper[5047]: I0223 08:58:54.221592 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-lmjpr"] Feb 23 08:58:54 crc kubenswrapper[5047]: I0223 08:58:54.363591 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-dcb1-account-create-update-96tpc"] Feb 23 08:58:54 crc kubenswrapper[5047]: W0223 08:58:54.368875 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031265ff_da09_4cb9_b2e8_f16da9486723.slice/crio-950bcc008bf5de80004f6037ab67f60b42c0ec033f4871b67e4785e08e151e4c WatchSource:0}: Error finding container 950bcc008bf5de80004f6037ab67f60b42c0ec033f4871b67e4785e08e151e4c: Status 404 returned error can't find the container with id 950bcc008bf5de80004f6037ab67f60b42c0ec033f4871b67e4785e08e151e4c Feb 23 08:58:55 crc kubenswrapper[5047]: I0223 08:58:55.081759 5047 generic.go:334] "Generic (PLEG): container finished" podID="e48ca0ec-13fa-4342-aafb-b1ef2c09e08b" containerID="8110fa3c9068cd81f7b808eb4081d1c9b96f0d6ba507405957aa5e3d3e2001f4" exitCode=0 Feb 23 08:58:55 crc 
kubenswrapper[5047]: I0223 08:58:55.081822 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lmjpr" event={"ID":"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b","Type":"ContainerDied","Data":"8110fa3c9068cd81f7b808eb4081d1c9b96f0d6ba507405957aa5e3d3e2001f4"} Feb 23 08:58:55 crc kubenswrapper[5047]: I0223 08:58:55.081884 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lmjpr" event={"ID":"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b","Type":"ContainerStarted","Data":"008499467f245f76362c603fd9fe75e7c9e5f0c25d2285b4d4d7594998052d24"} Feb 23 08:58:55 crc kubenswrapper[5047]: I0223 08:58:55.084306 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dcb1-account-create-update-96tpc" event={"ID":"031265ff-da09-4cb9-b2e8-f16da9486723","Type":"ContainerStarted","Data":"89fe650989f06df1fa46950063352f4f07e0ac4cdfbad470ce07cca5b1f0509c"} Feb 23 08:58:55 crc kubenswrapper[5047]: I0223 08:58:55.084341 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dcb1-account-create-update-96tpc" event={"ID":"031265ff-da09-4cb9-b2e8-f16da9486723","Type":"ContainerStarted","Data":"950bcc008bf5de80004f6037ab67f60b42c0ec033f4871b67e4785e08e151e4c"} Feb 23 08:58:55 crc kubenswrapper[5047]: I0223 08:58:55.119738 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-dcb1-account-create-update-96tpc" podStartSLOduration=2.119718551 podStartE2EDuration="2.119718551s" podCreationTimestamp="2026-02-23 08:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:58:55.116687319 +0000 UTC m=+8057.368014463" watchObservedRunningTime="2026-02-23 08:58:55.119718551 +0000 UTC m=+8057.371045695" Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.102593 5047 generic.go:334] "Generic (PLEG): container finished" podID="031265ff-da09-4cb9-b2e8-f16da9486723" 
containerID="89fe650989f06df1fa46950063352f4f07e0ac4cdfbad470ce07cca5b1f0509c" exitCode=0 Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.102699 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dcb1-account-create-update-96tpc" event={"ID":"031265ff-da09-4cb9-b2e8-f16da9486723","Type":"ContainerDied","Data":"89fe650989f06df1fa46950063352f4f07e0ac4cdfbad470ce07cca5b1f0509c"} Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.280163 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.458044 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.583396 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-operator-scripts\") pod \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.583469 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9c7m\" (UniqueName: \"kubernetes.io/projected/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-kube-api-access-d9c7m\") pod \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\" (UID: \"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b\") " Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.583851 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e48ca0ec-13fa-4342-aafb-b1ef2c09e08b" (UID: "e48ca0ec-13fa-4342-aafb-b1ef2c09e08b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.584456 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.589147 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-kube-api-access-d9c7m" (OuterVolumeSpecName: "kube-api-access-d9c7m") pod "e48ca0ec-13fa-4342-aafb-b1ef2c09e08b" (UID: "e48ca0ec-13fa-4342-aafb-b1ef2c09e08b"). InnerVolumeSpecName "kube-api-access-d9c7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:56 crc kubenswrapper[5047]: I0223 08:58:56.686369 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9c7m\" (UniqueName: \"kubernetes.io/projected/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b-kube-api-access-d9c7m\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.115130 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lmjpr" event={"ID":"e48ca0ec-13fa-4342-aafb-b1ef2c09e08b","Type":"ContainerDied","Data":"008499467f245f76362c603fd9fe75e7c9e5f0c25d2285b4d4d7594998052d24"} Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.115169 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lmjpr" Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.115176 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="008499467f245f76362c603fd9fe75e7c9e5f0c25d2285b4d4d7594998052d24" Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.519142 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.614697 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdscn\" (UniqueName: \"kubernetes.io/projected/031265ff-da09-4cb9-b2e8-f16da9486723-kube-api-access-xdscn\") pod \"031265ff-da09-4cb9-b2e8-f16da9486723\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.615006 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031265ff-da09-4cb9-b2e8-f16da9486723-operator-scripts\") pod \"031265ff-da09-4cb9-b2e8-f16da9486723\" (UID: \"031265ff-da09-4cb9-b2e8-f16da9486723\") " Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.616120 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/031265ff-da09-4cb9-b2e8-f16da9486723-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "031265ff-da09-4cb9-b2e8-f16da9486723" (UID: "031265ff-da09-4cb9-b2e8-f16da9486723"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.634254 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031265ff-da09-4cb9-b2e8-f16da9486723-kube-api-access-xdscn" (OuterVolumeSpecName: "kube-api-access-xdscn") pod "031265ff-da09-4cb9-b2e8-f16da9486723" (UID: "031265ff-da09-4cb9-b2e8-f16da9486723"). InnerVolumeSpecName "kube-api-access-xdscn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.717345 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031265ff-da09-4cb9-b2e8-f16da9486723-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:57 crc kubenswrapper[5047]: I0223 08:58:57.717376 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdscn\" (UniqueName: \"kubernetes.io/projected/031265ff-da09-4cb9-b2e8-f16da9486723-kube-api-access-xdscn\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.131875 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dcb1-account-create-update-96tpc" event={"ID":"031265ff-da09-4cb9-b2e8-f16da9486723","Type":"ContainerDied","Data":"950bcc008bf5de80004f6037ab67f60b42c0ec033f4871b67e4785e08e151e4c"} Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.132920 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950bcc008bf5de80004f6037ab67f60b42c0ec033f4871b67e4785e08e151e4c" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.132003 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-96tpc" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.688794 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-fd8nx"] Feb 23 08:58:58 crc kubenswrapper[5047]: E0223 08:58:58.689507 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48ca0ec-13fa-4342-aafb-b1ef2c09e08b" containerName="mariadb-database-create" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.689525 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48ca0ec-13fa-4342-aafb-b1ef2c09e08b" containerName="mariadb-database-create" Feb 23 08:58:58 crc kubenswrapper[5047]: E0223 08:58:58.689541 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031265ff-da09-4cb9-b2e8-f16da9486723" containerName="mariadb-account-create-update" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.689549 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="031265ff-da09-4cb9-b2e8-f16da9486723" containerName="mariadb-account-create-update" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.689730 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48ca0ec-13fa-4342-aafb-b1ef2c09e08b" containerName="mariadb-database-create" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.689762 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="031265ff-da09-4cb9-b2e8-f16da9486723" containerName="mariadb-account-create-update" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.690457 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.692367 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.692426 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.692832 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-x7sd7" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.693089 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.709504 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fd8nx"] Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.739063 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-combined-ca-bundle\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.739236 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-config-data\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.739406 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-scripts\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " 
pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.739661 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8sdd\" (UniqueName: \"kubernetes.io/projected/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-kube-api-access-v8sdd\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.841635 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-config-data\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.841977 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-scripts\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.842128 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8sdd\" (UniqueName: \"kubernetes.io/projected/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-kube-api-access-v8sdd\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.842243 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-combined-ca-bundle\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.846444 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-scripts\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.846473 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-config-data\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.851402 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-combined-ca-bundle\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:58 crc kubenswrapper[5047]: I0223 08:58:58.869741 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8sdd\" (UniqueName: \"kubernetes.io/projected/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-kube-api-access-v8sdd\") pod \"aodh-db-sync-fd8nx\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:59 crc kubenswrapper[5047]: I0223 08:58:59.015254 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:58:59 crc kubenswrapper[5047]: I0223 08:58:59.503864 5047 scope.go:117] "RemoveContainer" containerID="990d530341f70397379fe224e343c016e6e159f6712f07a61694c646dab09e15" Feb 23 08:58:59 crc kubenswrapper[5047]: I0223 08:58:59.523820 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fd8nx"] Feb 23 08:58:59 crc kubenswrapper[5047]: I0223 08:58:59.532127 5047 scope.go:117] "RemoveContainer" containerID="e91f6fc2a51b6dd890ddd0cd906848298a6cfb611dc8571a601ec0e8de7c70bd" Feb 23 08:58:59 crc kubenswrapper[5047]: I0223 08:58:59.570416 5047 scope.go:117] "RemoveContainer" containerID="59dfcbeeffabc99d039d88010bb5dbb9ae513cd15cc4d2873ae9b5dbbbe1aea4" Feb 23 08:58:59 crc kubenswrapper[5047]: I0223 08:58:59.588887 5047 scope.go:117] "RemoveContainer" containerID="fe25c9c5d07dc38acc41cd64e8eebf2bad34c1053763c03e932f4786a28248cb" Feb 23 08:58:59 crc kubenswrapper[5047]: I0223 08:58:59.612579 5047 scope.go:117] "RemoveContainer" containerID="f806e832e2772a999811e25f467bf3106a01c38ff43fad44b5eaadb575c9eb60" Feb 23 08:59:00 crc kubenswrapper[5047]: I0223 08:59:00.160172 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fd8nx" event={"ID":"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a","Type":"ContainerStarted","Data":"1f1f20dd5e1834273ac98c0f5f633b0bd0e250af425118344a00e7fd2e31343c"} Feb 23 08:59:05 crc kubenswrapper[5047]: I0223 08:59:05.218022 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fd8nx" event={"ID":"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a","Type":"ContainerStarted","Data":"5ee8f747b109ed16b9e7a124807d89b72589cbc2393d42b99e3fd63fd22b516e"} Feb 23 08:59:05 crc kubenswrapper[5047]: I0223 08:59:05.245316 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-fd8nx" podStartSLOduration=1.870223471 podStartE2EDuration="7.245294114s" podCreationTimestamp="2026-02-23 08:58:58 
+0000 UTC" firstStartedPulling="2026-02-23 08:58:59.532001351 +0000 UTC m=+8061.783328485" lastFinishedPulling="2026-02-23 08:59:04.907071994 +0000 UTC m=+8067.158399128" observedRunningTime="2026-02-23 08:59:05.238936273 +0000 UTC m=+8067.490263447" watchObservedRunningTime="2026-02-23 08:59:05.245294114 +0000 UTC m=+8067.496621248" Feb 23 08:59:06 crc kubenswrapper[5047]: I0223 08:59:06.282163 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 23 08:59:06 crc kubenswrapper[5047]: I0223 08:59:06.295077 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 23 08:59:07 crc kubenswrapper[5047]: I0223 08:59:07.265027 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 23 08:59:08 crc kubenswrapper[5047]: I0223 08:59:08.273490 5047 generic.go:334] "Generic (PLEG): container finished" podID="0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" containerID="5ee8f747b109ed16b9e7a124807d89b72589cbc2393d42b99e3fd63fd22b516e" exitCode=0 Feb 23 08:59:08 crc kubenswrapper[5047]: I0223 08:59:08.273568 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fd8nx" event={"ID":"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a","Type":"ContainerDied","Data":"5ee8f747b109ed16b9e7a124807d89b72589cbc2393d42b99e3fd63fd22b516e"} Feb 23 08:59:08 crc kubenswrapper[5047]: I0223 08:59:08.752380 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.668239 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.797413 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-config-data\") pod \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.797547 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-combined-ca-bundle\") pod \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.797668 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8sdd\" (UniqueName: \"kubernetes.io/projected/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-kube-api-access-v8sdd\") pod \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.797748 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-scripts\") pod \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\" (UID: \"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a\") " Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.804014 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-kube-api-access-v8sdd" (OuterVolumeSpecName: "kube-api-access-v8sdd") pod "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" (UID: "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a"). InnerVolumeSpecName "kube-api-access-v8sdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.804099 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-scripts" (OuterVolumeSpecName: "scripts") pod "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" (UID: "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.825109 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" (UID: "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.830433 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-config-data" (OuterVolumeSpecName: "config-data") pod "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" (UID: "0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.900070 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8sdd\" (UniqueName: \"kubernetes.io/projected/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-kube-api-access-v8sdd\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.900301 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.900310 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:09 crc kubenswrapper[5047]: I0223 08:59:09.900322 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:10 crc kubenswrapper[5047]: I0223 08:59:10.297751 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fd8nx" event={"ID":"0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a","Type":"ContainerDied","Data":"1f1f20dd5e1834273ac98c0f5f633b0bd0e250af425118344a00e7fd2e31343c"} Feb 23 08:59:10 crc kubenswrapper[5047]: I0223 08:59:10.297792 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f1f20dd5e1834273ac98c0f5f633b0bd0e250af425118344a00e7fd2e31343c" Feb 23 08:59:10 crc kubenswrapper[5047]: I0223 08:59:10.297843 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fd8nx" Feb 23 08:59:12 crc kubenswrapper[5047]: I0223 08:59:12.320791 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:59:12 crc kubenswrapper[5047]: I0223 08:59:12.321490 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="db0d2a61-bbc7-4d94-b49e-18b6647e6ace" containerName="kube-state-metrics" containerID="cri-o://1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972" gracePeriod=30 Feb 23 08:59:12 crc kubenswrapper[5047]: I0223 08:59:12.798949 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 08:59:12 crc kubenswrapper[5047]: I0223 08:59:12.969184 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcbp5\" (UniqueName: \"kubernetes.io/projected/db0d2a61-bbc7-4d94-b49e-18b6647e6ace-kube-api-access-fcbp5\") pod \"db0d2a61-bbc7-4d94-b49e-18b6647e6ace\" (UID: \"db0d2a61-bbc7-4d94-b49e-18b6647e6ace\") " Feb 23 08:59:12 crc kubenswrapper[5047]: I0223 08:59:12.974751 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db0d2a61-bbc7-4d94-b49e-18b6647e6ace-kube-api-access-fcbp5" (OuterVolumeSpecName: "kube-api-access-fcbp5") pod "db0d2a61-bbc7-4d94-b49e-18b6647e6ace" (UID: "db0d2a61-bbc7-4d94-b49e-18b6647e6ace"). InnerVolumeSpecName "kube-api-access-fcbp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.048839 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0953-account-create-update-w46gb"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.058521 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0953-account-create-update-w46gb"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.067249 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-zfpsk"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.071801 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcbp5\" (UniqueName: \"kubernetes.io/projected/db0d2a61-bbc7-4d94-b49e-18b6647e6ace-kube-api-access-fcbp5\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.076271 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-zfpsk"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.349530 5047 generic.go:334] "Generic (PLEG): container finished" podID="db0d2a61-bbc7-4d94-b49e-18b6647e6ace" containerID="1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972" exitCode=2 Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.349792 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"db0d2a61-bbc7-4d94-b49e-18b6647e6ace","Type":"ContainerDied","Data":"1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972"} Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.349830 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"db0d2a61-bbc7-4d94-b49e-18b6647e6ace","Type":"ContainerDied","Data":"2781e9576809c6aa578bc78c5e2fa63733b73fddb9d2c307b6c3be57bb3f797c"} Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.349846 5047 scope.go:117] "RemoveContainer" 
containerID="1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.349976 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.368778 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:13 crc kubenswrapper[5047]: E0223 08:59:13.369464 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db0d2a61-bbc7-4d94-b49e-18b6647e6ace" containerName="kube-state-metrics" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.369487 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0d2a61-bbc7-4d94-b49e-18b6647e6ace" containerName="kube-state-metrics" Feb 23 08:59:13 crc kubenswrapper[5047]: E0223 08:59:13.369510 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" containerName="aodh-db-sync" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.369520 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" containerName="aodh-db-sync" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.369766 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="db0d2a61-bbc7-4d94-b49e-18b6647e6ace" containerName="kube-state-metrics" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.369794 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" containerName="aodh-db-sync" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.374095 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.375421 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-x7sd7" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.385820 5047 scope.go:117] "RemoveContainer" containerID="1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.386066 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.386373 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 23 08:59:13 crc kubenswrapper[5047]: E0223 08:59:13.392658 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972\": container with ID starting with 1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972 not found: ID does not exist" containerID="1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.392728 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972"} err="failed to get container status \"1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972\": rpc error: code = NotFound desc = could not find container \"1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972\": container with ID starting with 1bbe89bf46812cf3d79addd0d981e03735f6af40c5c18a1376ac3a82a61ff972 not found: ID does not exist" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.412184 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.433060 5047 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.452188 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.463471 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.466620 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.470732 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.471227 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.481772 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-scripts\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.481936 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hvn\" (UniqueName: \"kubernetes.io/projected/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-kube-api-access-m9hvn\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.482396 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " 
pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.482511 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-config-data\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.492140 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.584164 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.584463 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4m9l\" (UniqueName: \"kubernetes.io/projected/2d714b4f-02d1-433b-ba95-54c199957dce-kube-api-access-q4m9l\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.584595 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-config-data\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.584677 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.584758 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-scripts\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.584871 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9hvn\" (UniqueName: \"kubernetes.io/projected/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-kube-api-access-m9hvn\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.584983 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.585092 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.589490 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.592027 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-config-data\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.608415 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-scripts\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.611534 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9hvn\" (UniqueName: \"kubernetes.io/projected/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-kube-api-access-m9hvn\") pod \"aodh-0\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.687366 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.687412 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.687498 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4m9l\" (UniqueName: \"kubernetes.io/projected/2d714b4f-02d1-433b-ba95-54c199957dce-kube-api-access-q4m9l\") pod 
\"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.687540 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.691360 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.691431 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.691529 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.706303 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4m9l\" (UniqueName: \"kubernetes.io/projected/2d714b4f-02d1-433b-ba95-54c199957dce-kube-api-access-q4m9l\") pod \"kube-state-metrics-0\" (UID: 
\"2d714b4f-02d1-433b-ba95-54c199957dce\") " pod="openstack/kube-state-metrics-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.708279 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 23 08:59:13 crc kubenswrapper[5047]: I0223 08:59:13.806751 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.241523 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.244932 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.298894 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 08:59:14 crc kubenswrapper[5047]: W0223 08:59:14.308550 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d714b4f_02d1_433b_ba95_54c199957dce.slice/crio-1134957310d5ea06b0897ec767a16c8f525402bf936aa6b654d8cce8ae60c26e WatchSource:0}: Error finding container 1134957310d5ea06b0897ec767a16c8f525402bf936aa6b654d8cce8ae60c26e: Status 404 returned error can't find the container with id 1134957310d5ea06b0897ec767a16c8f525402bf936aa6b654d8cce8ae60c26e Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.355259 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd73798-3881-4a98-86b0-dc80cf37d901" path="/var/lib/kubelet/pods/3cd73798-3881-4a98-86b0-dc80cf37d901/volumes" Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.356472 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a225bdc-5631-4a69-a67d-b59b7c055392" path="/var/lib/kubelet/pods/9a225bdc-5631-4a69-a67d-b59b7c055392/volumes" Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.357183 5047 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db0d2a61-bbc7-4d94-b49e-18b6647e6ace" path="/var/lib/kubelet/pods/db0d2a61-bbc7-4d94-b49e-18b6647e6ace/volumes" Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.360564 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2d714b4f-02d1-433b-ba95-54c199957dce","Type":"ContainerStarted","Data":"1134957310d5ea06b0897ec767a16c8f525402bf936aa6b654d8cce8ae60c26e"} Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.362516 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerStarted","Data":"983b9f330d92e41e6dca107d4b183483731b2bdde1192cf311b67c29a6e659d5"} Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.617406 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.618022 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-central-agent" containerID="cri-o://5cd3d57ab8936ab135a4d4e6c8dad445aef3a80cf6f2b4a5f7d09c87f12a0c99" gracePeriod=30 Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.618239 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="proxy-httpd" containerID="cri-o://3d289c02cef4ff3f3680dff6740e61f04461c782c8c63d6d33b366f845545f37" gracePeriod=30 Feb 23 08:59:14 crc kubenswrapper[5047]: I0223 08:59:14.618291 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="sg-core" containerID="cri-o://0e21d931a8dad7e4916143d30a545a73726a9eb165311b72ba230311ec886bb6" gracePeriod=30 Feb 23 08:59:14 crc 
kubenswrapper[5047]: I0223 08:59:14.618260 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-notification-agent" containerID="cri-o://1194df452564d0b4c44a973f143079606911ffd9ef3df159521cc2c9a83dcf07" gracePeriod=30 Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.371824 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerStarted","Data":"9a344a70d305da84272f456c84d94bd1c65b8cb0b3595b9f1c5f4cde6cc01402"} Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.376326 5047 generic.go:334] "Generic (PLEG): container finished" podID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerID="3d289c02cef4ff3f3680dff6740e61f04461c782c8c63d6d33b366f845545f37" exitCode=0 Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.376413 5047 generic.go:334] "Generic (PLEG): container finished" podID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerID="0e21d931a8dad7e4916143d30a545a73726a9eb165311b72ba230311ec886bb6" exitCode=2 Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.376486 5047 generic.go:334] "Generic (PLEG): container finished" podID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerID="5cd3d57ab8936ab135a4d4e6c8dad445aef3a80cf6f2b4a5f7d09c87f12a0c99" exitCode=0 Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.376576 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerDied","Data":"3d289c02cef4ff3f3680dff6740e61f04461c782c8c63d6d33b366f845545f37"} Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.376662 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerDied","Data":"0e21d931a8dad7e4916143d30a545a73726a9eb165311b72ba230311ec886bb6"} Feb 
23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.376722 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerDied","Data":"5cd3d57ab8936ab135a4d4e6c8dad445aef3a80cf6f2b4a5f7d09c87f12a0c99"} Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.378993 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2d714b4f-02d1-433b-ba95-54c199957dce","Type":"ContainerStarted","Data":"f23fd67e4cf7fa54402cf3da162be46abfc25906ea3cdcb5916c37e37fe8051a"} Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.380201 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 23 08:59:15 crc kubenswrapper[5047]: I0223 08:59:15.409007 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.9768803240000001 podStartE2EDuration="2.408985574s" podCreationTimestamp="2026-02-23 08:59:13 +0000 UTC" firstStartedPulling="2026-02-23 08:59:14.31364269 +0000 UTC m=+8076.564969814" lastFinishedPulling="2026-02-23 08:59:14.74574793 +0000 UTC m=+8076.997075064" observedRunningTime="2026-02-23 08:59:15.396301863 +0000 UTC m=+8077.647629007" watchObservedRunningTime="2026-02-23 08:59:15.408985574 +0000 UTC m=+8077.660312718" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.424294 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerStarted","Data":"e51df2969e8cab06944c3378d39d1d2e5c0c75e7deb374b823bcf69db53f32c7"} Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.451838 5047 generic.go:334] "Generic (PLEG): container finished" podID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerID="1194df452564d0b4c44a973f143079606911ffd9ef3df159521cc2c9a83dcf07" exitCode=0 Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 
08:59:16.452110 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerDied","Data":"1194df452564d0b4c44a973f143079606911ffd9ef3df159521cc2c9a83dcf07"} Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.595896 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.766074 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-combined-ca-bundle\") pod \"c23289c0-430c-4d64-b177-74a2eba5a09f\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.766180 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-log-httpd\") pod \"c23289c0-430c-4d64-b177-74a2eba5a09f\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.766237 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-sg-core-conf-yaml\") pod \"c23289c0-430c-4d64-b177-74a2eba5a09f\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.766413 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-scripts\") pod \"c23289c0-430c-4d64-b177-74a2eba5a09f\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.766501 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-run-httpd\") pod \"c23289c0-430c-4d64-b177-74a2eba5a09f\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.766678 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-config-data\") pod \"c23289c0-430c-4d64-b177-74a2eba5a09f\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.766776 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9vhw\" (UniqueName: \"kubernetes.io/projected/c23289c0-430c-4d64-b177-74a2eba5a09f-kube-api-access-g9vhw\") pod \"c23289c0-430c-4d64-b177-74a2eba5a09f\" (UID: \"c23289c0-430c-4d64-b177-74a2eba5a09f\") " Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.767235 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c23289c0-430c-4d64-b177-74a2eba5a09f" (UID: "c23289c0-430c-4d64-b177-74a2eba5a09f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.768135 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c23289c0-430c-4d64-b177-74a2eba5a09f" (UID: "c23289c0-430c-4d64-b177-74a2eba5a09f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.768178 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.774777 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-scripts" (OuterVolumeSpecName: "scripts") pod "c23289c0-430c-4d64-b177-74a2eba5a09f" (UID: "c23289c0-430c-4d64-b177-74a2eba5a09f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.780199 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23289c0-430c-4d64-b177-74a2eba5a09f-kube-api-access-g9vhw" (OuterVolumeSpecName: "kube-api-access-g9vhw") pod "c23289c0-430c-4d64-b177-74a2eba5a09f" (UID: "c23289c0-430c-4d64-b177-74a2eba5a09f"). InnerVolumeSpecName "kube-api-access-g9vhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.823789 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c23289c0-430c-4d64-b177-74a2eba5a09f" (UID: "c23289c0-430c-4d64-b177-74a2eba5a09f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.872778 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9vhw\" (UniqueName: \"kubernetes.io/projected/c23289c0-430c-4d64-b177-74a2eba5a09f-kube-api-access-g9vhw\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.873123 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23289c0-430c-4d64-b177-74a2eba5a09f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.873190 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.873250 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.876031 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c23289c0-430c-4d64-b177-74a2eba5a09f" (UID: "c23289c0-430c-4d64-b177-74a2eba5a09f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.910118 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-config-data" (OuterVolumeSpecName: "config-data") pod "c23289c0-430c-4d64-b177-74a2eba5a09f" (UID: "c23289c0-430c-4d64-b177-74a2eba5a09f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.974829 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:16 crc kubenswrapper[5047]: I0223 08:59:16.975304 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23289c0-430c-4d64-b177-74a2eba5a09f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.285508 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.462865 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.462841 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23289c0-430c-4d64-b177-74a2eba5a09f","Type":"ContainerDied","Data":"619ce537754aa39af5c2ccf288d5f888ccae8ae8cee84603f28541cd847cd34b"} Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.463036 5047 scope.go:117] "RemoveContainer" containerID="3d289c02cef4ff3f3680dff6740e61f04461c782c8c63d6d33b366f845545f37" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.486896 5047 scope.go:117] "RemoveContainer" containerID="0e21d931a8dad7e4916143d30a545a73726a9eb165311b72ba230311ec886bb6" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.515260 5047 scope.go:117] "RemoveContainer" containerID="1194df452564d0b4c44a973f143079606911ffd9ef3df159521cc2c9a83dcf07" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.529037 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.534991 5047 scope.go:117] "RemoveContainer" 
containerID="5cd3d57ab8936ab135a4d4e6c8dad445aef3a80cf6f2b4a5f7d09c87f12a0c99" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.542947 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.553323 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:17 crc kubenswrapper[5047]: E0223 08:59:17.553683 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-notification-agent" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.553701 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-notification-agent" Feb 23 08:59:17 crc kubenswrapper[5047]: E0223 08:59:17.553730 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="proxy-httpd" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.553736 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="proxy-httpd" Feb 23 08:59:17 crc kubenswrapper[5047]: E0223 08:59:17.553760 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-central-agent" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.553766 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-central-agent" Feb 23 08:59:17 crc kubenswrapper[5047]: E0223 08:59:17.553782 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="sg-core" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.553787 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="sg-core" Feb 23 08:59:17 
crc kubenswrapper[5047]: I0223 08:59:17.554002 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="sg-core" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.554029 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="proxy-httpd" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.554038 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-notification-agent" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.554051 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" containerName="ceilometer-central-agent" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.556882 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.563550 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.563869 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.564149 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.579922 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.690578 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " 
pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.690940 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5swkn\" (UniqueName: \"kubernetes.io/projected/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-kube-api-access-5swkn\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.690966 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-log-httpd\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.690997 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-config-data\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.691013 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-scripts\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.691051 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.691084 5047 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.691118 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-run-httpd\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793414 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793486 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5swkn\" (UniqueName: \"kubernetes.io/projected/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-kube-api-access-5swkn\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793520 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-log-httpd\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793563 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-config-data\") pod \"ceilometer-0\" 
(UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793589 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-scripts\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793635 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793680 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.793728 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-run-httpd\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.794341 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-run-httpd\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.794633 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-log-httpd\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.800775 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-scripts\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.800826 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.804938 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.806358 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.816865 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5swkn\" (UniqueName: \"kubernetes.io/projected/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-kube-api-access-5swkn\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.817416 
5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-config-data\") pod \"ceilometer-0\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " pod="openstack/ceilometer-0" Feb 23 08:59:17 crc kubenswrapper[5047]: I0223 08:59:17.925393 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:18 crc kubenswrapper[5047]: I0223 08:59:18.357181 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23289c0-430c-4d64-b177-74a2eba5a09f" path="/var/lib/kubelet/pods/c23289c0-430c-4d64-b177-74a2eba5a09f/volumes" Feb 23 08:59:18 crc kubenswrapper[5047]: I0223 08:59:18.411062 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:18 crc kubenswrapper[5047]: I0223 08:59:18.476327 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerStarted","Data":"98a71348c2797ffcd1130188b50e453e15bf9b1e58ee0d0713a25d02cd4af58b"} Feb 23 08:59:18 crc kubenswrapper[5047]: W0223 08:59:18.730872 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb0665b2_dd73_4a04_aa6c_43aeb1baf8f4.slice/crio-a59f11a217c85c33e20c1fd6a27e1b6de293527df8cb631747f8b0f586ac8e23 WatchSource:0}: Error finding container a59f11a217c85c33e20c1fd6a27e1b6de293527df8cb631747f8b0f586ac8e23: Status 404 returned error can't find the container with id a59f11a217c85c33e20c1fd6a27e1b6de293527df8cb631747f8b0f586ac8e23 Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.289054 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.493534 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerStarted","Data":"34f60f373a002128de40a0235b7c75ab6d4805ff06e0522e8facc1e315214e3d"} Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.494503 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-api" containerID="cri-o://9a344a70d305da84272f456c84d94bd1c65b8cb0b3595b9f1c5f4cde6cc01402" gracePeriod=30 Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.494998 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-notifier" containerID="cri-o://98a71348c2797ffcd1130188b50e453e15bf9b1e58ee0d0713a25d02cd4af58b" gracePeriod=30 Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.495019 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-evaluator" containerID="cri-o://e51df2969e8cab06944c3378d39d1d2e5c0c75e7deb374b823bcf69db53f32c7" gracePeriod=30 Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.495108 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-listener" containerID="cri-o://34f60f373a002128de40a0235b7c75ab6d4805ff06e0522e8facc1e315214e3d" gracePeriod=30 Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.508147 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerStarted","Data":"7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370"} Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.508198 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerStarted","Data":"cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5"} Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.508213 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerStarted","Data":"a59f11a217c85c33e20c1fd6a27e1b6de293527df8cb631747f8b0f586ac8e23"} Feb 23 08:59:19 crc kubenswrapper[5047]: I0223 08:59:19.521489 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.9604871830000001 podStartE2EDuration="6.521475048s" podCreationTimestamp="2026-02-23 08:59:13 +0000 UTC" firstStartedPulling="2026-02-23 08:59:14.241162318 +0000 UTC m=+8076.492489452" lastFinishedPulling="2026-02-23 08:59:18.802150183 +0000 UTC m=+8081.053477317" observedRunningTime="2026-02-23 08:59:19.519958718 +0000 UTC m=+8081.771285852" watchObservedRunningTime="2026-02-23 08:59:19.521475048 +0000 UTC m=+8081.772802182" Feb 23 08:59:20 crc kubenswrapper[5047]: I0223 08:59:20.520211 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerStarted","Data":"3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6"} Feb 23 08:59:20 crc kubenswrapper[5047]: I0223 08:59:20.522768 5047 generic.go:334] "Generic (PLEG): container finished" podID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerID="e51df2969e8cab06944c3378d39d1d2e5c0c75e7deb374b823bcf69db53f32c7" exitCode=0 Feb 23 08:59:20 crc kubenswrapper[5047]: I0223 08:59:20.522794 5047 generic.go:334] "Generic (PLEG): container finished" podID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerID="9a344a70d305da84272f456c84d94bd1c65b8cb0b3595b9f1c5f4cde6cc01402" exitCode=0 Feb 23 08:59:20 crc kubenswrapper[5047]: I0223 08:59:20.522810 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerDied","Data":"e51df2969e8cab06944c3378d39d1d2e5c0c75e7deb374b823bcf69db53f32c7"} Feb 23 08:59:20 crc kubenswrapper[5047]: I0223 08:59:20.522825 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerDied","Data":"9a344a70d305da84272f456c84d94bd1c65b8cb0b3595b9f1c5f4cde6cc01402"} Feb 23 08:59:22 crc kubenswrapper[5047]: I0223 08:59:22.549660 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerStarted","Data":"64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f"} Feb 23 08:59:22 crc kubenswrapper[5047]: I0223 08:59:22.550082 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-central-agent" containerID="cri-o://cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5" gracePeriod=30 Feb 23 08:59:22 crc kubenswrapper[5047]: I0223 08:59:22.550154 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 08:59:22 crc kubenswrapper[5047]: I0223 08:59:22.550221 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="proxy-httpd" containerID="cri-o://64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f" gracePeriod=30 Feb 23 08:59:22 crc kubenswrapper[5047]: I0223 08:59:22.550276 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="sg-core" containerID="cri-o://3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6" gracePeriod=30 Feb 23 08:59:22 crc 
kubenswrapper[5047]: I0223 08:59:22.550321 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-notification-agent" containerID="cri-o://7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370" gracePeriod=30 Feb 23 08:59:22 crc kubenswrapper[5047]: I0223 08:59:22.575162 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.920244329 podStartE2EDuration="5.575141592s" podCreationTimestamp="2026-02-23 08:59:17 +0000 UTC" firstStartedPulling="2026-02-23 08:59:18.735658822 +0000 UTC m=+8080.986985956" lastFinishedPulling="2026-02-23 08:59:21.390556085 +0000 UTC m=+8083.641883219" observedRunningTime="2026-02-23 08:59:22.567961709 +0000 UTC m=+8084.819288853" watchObservedRunningTime="2026-02-23 08:59:22.575141592 +0000 UTC m=+8084.826468736" Feb 23 08:59:23 crc kubenswrapper[5047]: I0223 08:59:23.563408 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerID="64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f" exitCode=0 Feb 23 08:59:23 crc kubenswrapper[5047]: I0223 08:59:23.563698 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerID="3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6" exitCode=2 Feb 23 08:59:23 crc kubenswrapper[5047]: I0223 08:59:23.563708 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerID="7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370" exitCode=0 Feb 23 08:59:23 crc kubenswrapper[5047]: I0223 08:59:23.563628 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerDied","Data":"64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f"} Feb 23 08:59:23 crc kubenswrapper[5047]: I0223 08:59:23.563747 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerDied","Data":"3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6"} Feb 23 08:59:23 crc kubenswrapper[5047]: I0223 08:59:23.563766 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerDied","Data":"7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370"} Feb 23 08:59:23 crc kubenswrapper[5047]: I0223 08:59:23.821202 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.111420 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222253 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-run-httpd\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222375 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5swkn\" (UniqueName: \"kubernetes.io/projected/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-kube-api-access-5swkn\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222485 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-log-httpd\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222602 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-config-data\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222659 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-scripts\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222711 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222807 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-sg-core-conf-yaml\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.222891 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-combined-ca-bundle\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.223003 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-ceilometer-tls-certs\") pod \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\" (UID: \"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4\") " Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.223076 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.223790 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.223824 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.228255 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-kube-api-access-5swkn" (OuterVolumeSpecName: "kube-api-access-5swkn") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "kube-api-access-5swkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.241748 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-scripts" (OuterVolumeSpecName: "scripts") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.255076 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.295160 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.326104 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5swkn\" (UniqueName: \"kubernetes.io/projected/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-kube-api-access-5swkn\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.326309 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.326365 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.326417 5047 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.380547 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.393106 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-config-data" (OuterVolumeSpecName: "config-data") pod "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" (UID: "eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.428490 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.428550 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.635684 5047 generic.go:334] "Generic (PLEG): container finished" podID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerID="cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5" exitCode=0 Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.635736 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.635753 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerDied","Data":"cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5"} Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.636542 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4","Type":"ContainerDied","Data":"a59f11a217c85c33e20c1fd6a27e1b6de293527df8cb631747f8b0f586ac8e23"} Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.636561 5047 scope.go:117] "RemoveContainer" containerID="64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.673559 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.674862 5047 scope.go:117] "RemoveContainer" containerID="3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.699317 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711003 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.711501 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-central-agent" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711542 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-central-agent" Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.711561 5047 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="sg-core" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711573 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="sg-core" Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.711596 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-notification-agent" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711605 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-notification-agent" Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.711636 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="proxy-httpd" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711644 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="proxy-httpd" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711896 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="sg-core" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711946 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="proxy-httpd" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711978 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-notification-agent" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.711990 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" containerName="ceilometer-central-agent" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.714139 5047 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.716564 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.717224 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.722951 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.743604 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.765052 5047 scope.go:117] "RemoveContainer" containerID="7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.789344 5047 scope.go:117] "RemoveContainer" containerID="cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.815701 5047 scope.go:117] "RemoveContainer" containerID="64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f" Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.816245 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f\": container with ID starting with 64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f not found: ID does not exist" containerID="64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.816289 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f"} err="failed to get container status 
\"64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f\": rpc error: code = NotFound desc = could not find container \"64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f\": container with ID starting with 64e0dfa2a95116ae428b701a4546e6e7497a091305e28da272fa39f44e7fd20f not found: ID does not exist" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.816315 5047 scope.go:117] "RemoveContainer" containerID="3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6" Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.816677 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6\": container with ID starting with 3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6 not found: ID does not exist" containerID="3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.816756 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6"} err="failed to get container status \"3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6\": rpc error: code = NotFound desc = could not find container \"3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6\": container with ID starting with 3714b187c77584f83e1e64ba7d6c734ef5f89206d0843fd06e3e5570083893a6 not found: ID does not exist" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.816771 5047 scope.go:117] "RemoveContainer" containerID="7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370" Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.817041 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370\": container with ID starting with 7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370 not found: ID does not exist" containerID="7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.817061 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370"} err="failed to get container status \"7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370\": rpc error: code = NotFound desc = could not find container \"7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370\": container with ID starting with 7489988b6f0fac175bc52429fb0ad6ac2148d000108e81107975b91d42cbc370 not found: ID does not exist" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.817075 5047 scope.go:117] "RemoveContainer" containerID="cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5" Feb 23 08:59:28 crc kubenswrapper[5047]: E0223 08:59:28.817407 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5\": container with ID starting with cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5 not found: ID does not exist" containerID="cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.817429 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5"} err="failed to get container status \"cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5\": rpc error: code = NotFound desc = could not find container \"cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5\": container with ID 
starting with cd238d20a084a0cfaa25cbb6f71030e0a8e68f9a45026e684eddb175283603c5 not found: ID does not exist" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.834935 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.835004 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dr2\" (UniqueName: \"kubernetes.io/projected/b24cc8b9-bf0e-45d8-85a9-7c3937896968-kube-api-access-h6dr2\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.835143 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-run-httpd\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.835223 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-scripts\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.835276 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 
crc kubenswrapper[5047]: I0223 08:59:28.835332 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-config-data\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.835463 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-log-httpd\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.835488 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937330 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dr2\" (UniqueName: \"kubernetes.io/projected/b24cc8b9-bf0e-45d8-85a9-7c3937896968-kube-api-access-h6dr2\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937420 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-run-httpd\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937457 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-scripts\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937481 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937506 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-config-data\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937560 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-log-httpd\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937581 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.937659 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: 
I0223 08:59:28.938521 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-log-httpd\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.938653 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-run-httpd\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.943730 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-scripts\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.943828 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.946884 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-config-data\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.948411 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " 
pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.949712 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:28 crc kubenswrapper[5047]: I0223 08:59:28.960379 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dr2\" (UniqueName: \"kubernetes.io/projected/b24cc8b9-bf0e-45d8-85a9-7c3937896968-kube-api-access-h6dr2\") pod \"ceilometer-0\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " pod="openstack/ceilometer-0" Feb 23 08:59:29 crc kubenswrapper[5047]: I0223 08:59:29.073393 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:59:29 crc kubenswrapper[5047]: I0223 08:59:29.552600 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:59:29 crc kubenswrapper[5047]: I0223 08:59:29.647437 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerStarted","Data":"36109629f4a0c8a04a72f128f8db98fbfb27085eeb7b6927cb9c00597f00dea5"} Feb 23 08:59:30 crc kubenswrapper[5047]: I0223 08:59:30.360305 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4" path="/var/lib/kubelet/pods/eb0665b2-dd73-4a04-aa6c-43aeb1baf8f4/volumes" Feb 23 08:59:30 crc kubenswrapper[5047]: I0223 08:59:30.659265 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerStarted","Data":"5fe5d75b2b3b7484337c60c1fa98ffc49a3af8f8d35cefe2ba43edd510fddd68"} Feb 23 08:59:30 crc kubenswrapper[5047]: I0223 08:59:30.659323 5047 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerStarted","Data":"a65243912ed9e00054f4ebb306fb5fe73793f4ea7a467cd196686a7efbdd8df4"} Feb 23 08:59:31 crc kubenswrapper[5047]: I0223 08:59:31.670219 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerStarted","Data":"60141a0931e5bb65b57978e1903e918449f3377efc61f92213ef2a271cd7c68c"} Feb 23 08:59:32 crc kubenswrapper[5047]: I0223 08:59:32.691826 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerStarted","Data":"a0215ac3fbe253cdcd33e23224e66c3483f6042b430140b93acbd73cad3aa678"} Feb 23 08:59:32 crc kubenswrapper[5047]: I0223 08:59:32.717934 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.971688405 podStartE2EDuration="4.717884828s" podCreationTimestamp="2026-02-23 08:59:28 +0000 UTC" firstStartedPulling="2026-02-23 08:59:29.561082206 +0000 UTC m=+8091.812409340" lastFinishedPulling="2026-02-23 08:59:32.307278629 +0000 UTC m=+8094.558605763" observedRunningTime="2026-02-23 08:59:32.711441315 +0000 UTC m=+8094.962768449" watchObservedRunningTime="2026-02-23 08:59:32.717884828 +0000 UTC m=+8094.969211962" Feb 23 08:59:33 crc kubenswrapper[5047]: I0223 08:59:33.701441 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 08:59:39 crc kubenswrapper[5047]: I0223 08:59:39.059042 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-h4lq4"] Feb 23 08:59:39 crc kubenswrapper[5047]: I0223 08:59:39.066844 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-h4lq4"] Feb 23 08:59:40 crc kubenswrapper[5047]: I0223 08:59:40.352331 5047 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6eb42e62-854d-4100-a22c-631a4292d890" path="/var/lib/kubelet/pods/6eb42e62-854d-4100-a22c-631a4292d890/volumes" Feb 23 08:59:49 crc kubenswrapper[5047]: I0223 08:59:49.883370 5047 generic.go:334] "Generic (PLEG): container finished" podID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerID="34f60f373a002128de40a0235b7c75ab6d4805ff06e0522e8facc1e315214e3d" exitCode=137 Feb 23 08:59:49 crc kubenswrapper[5047]: I0223 08:59:49.884016 5047 generic.go:334] "Generic (PLEG): container finished" podID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerID="98a71348c2797ffcd1130188b50e453e15bf9b1e58ee0d0713a25d02cd4af58b" exitCode=137 Feb 23 08:59:49 crc kubenswrapper[5047]: I0223 08:59:49.883581 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerDied","Data":"34f60f373a002128de40a0235b7c75ab6d4805ff06e0522e8facc1e315214e3d"} Feb 23 08:59:49 crc kubenswrapper[5047]: I0223 08:59:49.884087 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerDied","Data":"98a71348c2797ffcd1130188b50e453e15bf9b1e58ee0d0713a25d02cd4af58b"} Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.525097 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.662883 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-combined-ca-bundle\") pod \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.662972 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9hvn\" (UniqueName: \"kubernetes.io/projected/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-kube-api-access-m9hvn\") pod \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.663031 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-config-data\") pod \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.663138 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-scripts\") pod \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\" (UID: \"1853024c-432e-43e9-8d9b-cb5a4cdbc39f\") " Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.669566 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-kube-api-access-m9hvn" (OuterVolumeSpecName: "kube-api-access-m9hvn") pod "1853024c-432e-43e9-8d9b-cb5a4cdbc39f" (UID: "1853024c-432e-43e9-8d9b-cb5a4cdbc39f"). InnerVolumeSpecName "kube-api-access-m9hvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.670055 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-scripts" (OuterVolumeSpecName: "scripts") pod "1853024c-432e-43e9-8d9b-cb5a4cdbc39f" (UID: "1853024c-432e-43e9-8d9b-cb5a4cdbc39f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.765369 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.765410 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9hvn\" (UniqueName: \"kubernetes.io/projected/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-kube-api-access-m9hvn\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.816002 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-config-data" (OuterVolumeSpecName: "config-data") pod "1853024c-432e-43e9-8d9b-cb5a4cdbc39f" (UID: "1853024c-432e-43e9-8d9b-cb5a4cdbc39f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.818162 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1853024c-432e-43e9-8d9b-cb5a4cdbc39f" (UID: "1853024c-432e-43e9-8d9b-cb5a4cdbc39f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.867246 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.867527 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853024c-432e-43e9-8d9b-cb5a4cdbc39f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.923547 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1853024c-432e-43e9-8d9b-cb5a4cdbc39f","Type":"ContainerDied","Data":"983b9f330d92e41e6dca107d4b183483731b2bdde1192cf311b67c29a6e659d5"} Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.923600 5047 scope.go:117] "RemoveContainer" containerID="34f60f373a002128de40a0235b7c75ab6d4805ff06e0522e8facc1e315214e3d" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.923709 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.962346 5047 scope.go:117] "RemoveContainer" containerID="98a71348c2797ffcd1130188b50e453e15bf9b1e58ee0d0713a25d02cd4af58b" Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.975876 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.989476 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:50 crc kubenswrapper[5047]: I0223 08:59:50.995067 5047 scope.go:117] "RemoveContainer" containerID="e51df2969e8cab06944c3378d39d1d2e5c0c75e7deb374b823bcf69db53f32c7" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028211 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:51 crc kubenswrapper[5047]: E0223 08:59:51.028652 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-api" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028673 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-api" Feb 23 08:59:51 crc kubenswrapper[5047]: E0223 08:59:51.028692 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-listener" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028701 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-listener" Feb 23 08:59:51 crc kubenswrapper[5047]: E0223 08:59:51.028716 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-notifier" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028723 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-notifier" 
Feb 23 08:59:51 crc kubenswrapper[5047]: E0223 08:59:51.028735 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-evaluator" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028742 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-evaluator" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028930 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-api" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028949 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-evaluator" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028969 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-notifier" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.028981 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" containerName="aodh-listener" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.030662 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.035421 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.035594 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.035709 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.035831 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.035968 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-x7sd7" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.039702 5047 scope.go:117] "RemoveContainer" containerID="9a344a70d305da84272f456c84d94bd1c65b8cb0b3595b9f1c5f4cde6cc01402" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.064036 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.175319 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-internal-tls-certs\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.175515 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-public-tls-certs\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.175613 
5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-scripts\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.175648 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jxh\" (UniqueName: \"kubernetes.io/projected/da9c6500-238e-415a-9e31-e7bf9ccdd205-kube-api-access-l2jxh\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.175780 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-combined-ca-bundle\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.176186 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-config-data\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.278039 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-public-tls-certs\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.278096 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-scripts\") pod \"aodh-0\" (UID: 
\"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.278125 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jxh\" (UniqueName: \"kubernetes.io/projected/da9c6500-238e-415a-9e31-e7bf9ccdd205-kube-api-access-l2jxh\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.278165 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-combined-ca-bundle\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.278267 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-config-data\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.278295 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-internal-tls-certs\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.284732 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-public-tls-certs\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.286436 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-combined-ca-bundle\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.287252 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-internal-tls-certs\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.287469 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-scripts\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.288152 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-config-data\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.298739 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jxh\" (UniqueName: \"kubernetes.io/projected/da9c6500-238e-415a-9e31-e7bf9ccdd205-kube-api-access-l2jxh\") pod \"aodh-0\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.358099 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 23 08:59:51 crc kubenswrapper[5047]: I0223 08:59:51.916414 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 23 08:59:51 crc kubenswrapper[5047]: W0223 08:59:51.928503 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda9c6500_238e_415a_9e31_e7bf9ccdd205.slice/crio-23fe375fd297f1a2f28ba51c2e0cd8e6145ccb0f45696d8d95c745b5f117ae0e WatchSource:0}: Error finding container 23fe375fd297f1a2f28ba51c2e0cd8e6145ccb0f45696d8d95c745b5f117ae0e: Status 404 returned error can't find the container with id 23fe375fd297f1a2f28ba51c2e0cd8e6145ccb0f45696d8d95c745b5f117ae0e Feb 23 08:59:52 crc kubenswrapper[5047]: I0223 08:59:52.357213 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1853024c-432e-43e9-8d9b-cb5a4cdbc39f" path="/var/lib/kubelet/pods/1853024c-432e-43e9-8d9b-cb5a4cdbc39f/volumes" Feb 23 08:59:52 crc kubenswrapper[5047]: I0223 08:59:52.951851 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerStarted","Data":"7a05e953fc80368078f66f82f035d35c3158450022c92ac887c442e42eb47b06"} Feb 23 08:59:52 crc kubenswrapper[5047]: I0223 08:59:52.951924 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerStarted","Data":"e5e7ee2c8b135051740b1ea7d291484779a9c2e5a708219176467993426ff5fd"} Feb 23 08:59:52 crc kubenswrapper[5047]: I0223 08:59:52.951939 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerStarted","Data":"23fe375fd297f1a2f28ba51c2e0cd8e6145ccb0f45696d8d95c745b5f117ae0e"} Feb 23 08:59:53 crc kubenswrapper[5047]: I0223 08:59:53.965156 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerStarted","Data":"e932f691f1d7bc2d70f16a64edd63fd7ca5272ea9a3dce01dda8b2cbb647641f"} Feb 23 08:59:53 crc kubenswrapper[5047]: I0223 08:59:53.965658 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerStarted","Data":"a0fef05f6987a47a8d1202e9246540bf050838d7c193ea1ef5e1bb51346912c0"} Feb 23 08:59:54 crc kubenswrapper[5047]: I0223 08:59:54.000268 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.367935843 podStartE2EDuration="4.000222121s" podCreationTimestamp="2026-02-23 08:59:50 +0000 UTC" firstStartedPulling="2026-02-23 08:59:51.935579608 +0000 UTC m=+8114.186906742" lastFinishedPulling="2026-02-23 08:59:53.567865886 +0000 UTC m=+8115.819193020" observedRunningTime="2026-02-23 08:59:53.988695081 +0000 UTC m=+8116.240022245" watchObservedRunningTime="2026-02-23 08:59:54.000222121 +0000 UTC m=+8116.251549255" Feb 23 08:59:59 crc kubenswrapper[5047]: I0223 08:59:59.106988 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 08:59:59 crc kubenswrapper[5047]: I0223 08:59:59.797095 5047 scope.go:117] "RemoveContainer" containerID="a164cf73073bc9b7a575ba24df33d8f55c9938205bd8eacdd659460c4cd11ee7" Feb 23 08:59:59 crc kubenswrapper[5047]: I0223 08:59:59.822097 5047 scope.go:117] "RemoveContainer" containerID="bafd5f1c994050944eaf630f40ff770762aa20659a81d657072958d5ad173f93" Feb 23 08:59:59 crc kubenswrapper[5047]: I0223 08:59:59.872782 5047 scope.go:117] "RemoveContainer" containerID="797cd075f412154a1f8d9ad573c8dace687d3d7909df29163a20b4d2fcc3c2ba" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.187349 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld"] Feb 23 09:00:00 crc 
kubenswrapper[5047]: I0223 09:00:00.189436 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.195546 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.196684 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.202202 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld"] Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.335513 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkxms\" (UniqueName: \"kubernetes.io/projected/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-kube-api-access-kkxms\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.335645 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-config-volume\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.335714 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-secret-volume\") pod \"collect-profiles-29530620-985ld\" 
(UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.437441 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-config-volume\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.437540 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-secret-volume\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.437732 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkxms\" (UniqueName: \"kubernetes.io/projected/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-kube-api-access-kkxms\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.438599 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-config-volume\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.446785 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-secret-volume\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.456277 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkxms\" (UniqueName: \"kubernetes.io/projected/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-kube-api-access-kkxms\") pod \"collect-profiles-29530620-985ld\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:00 crc kubenswrapper[5047]: I0223 09:00:00.509436 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:01 crc kubenswrapper[5047]: I0223 09:00:01.010207 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld"] Feb 23 09:00:01 crc kubenswrapper[5047]: I0223 09:00:01.041037 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" event={"ID":"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d","Type":"ContainerStarted","Data":"295697bede6aa698299ec28c04a8478b36ad90bd9fa78a3c5463edf4c572355f"} Feb 23 09:00:01 crc kubenswrapper[5047]: E0223 09:00:01.937217 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fe3ebe_6b2c_4070_b7da_23f413a0fd0d.slice/crio-conmon-a037c2158690fa99501c28013aed2d3913a466e2f67a19649b7f0524365a544a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fe3ebe_6b2c_4070_b7da_23f413a0fd0d.slice/crio-a037c2158690fa99501c28013aed2d3913a466e2f67a19649b7f0524365a544a.scope\": RecentStats: unable to find data in memory cache]" Feb 23 09:00:02 crc kubenswrapper[5047]: I0223 09:00:02.053682 5047 generic.go:334] "Generic (PLEG): container finished" podID="e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d" containerID="a037c2158690fa99501c28013aed2d3913a466e2f67a19649b7f0524365a544a" exitCode=0 Feb 23 09:00:02 crc kubenswrapper[5047]: I0223 09:00:02.053789 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" event={"ID":"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d","Type":"ContainerDied","Data":"a037c2158690fa99501c28013aed2d3913a466e2f67a19649b7f0524365a544a"} Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.517103 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.626410 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-config-volume\") pod \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.626652 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-secret-volume\") pod \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.627393 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-config-volume" (OuterVolumeSpecName: 
"config-volume") pod "e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d" (UID: "e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.627686 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkxms\" (UniqueName: \"kubernetes.io/projected/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-kube-api-access-kkxms\") pod \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\" (UID: \"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d\") " Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.628452 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.638326 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-kube-api-access-kkxms" (OuterVolumeSpecName: "kube-api-access-kkxms") pod "e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d" (UID: "e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d"). InnerVolumeSpecName "kube-api-access-kkxms". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.647003 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d" (UID: "e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.729947 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkxms\" (UniqueName: \"kubernetes.io/projected/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-kube-api-access-kkxms\") on node \"crc\" DevicePath \"\"" Feb 23 09:00:03 crc kubenswrapper[5047]: I0223 09:00:03.729998 5047 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:00:04 crc kubenswrapper[5047]: I0223 09:00:04.078814 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" event={"ID":"e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d","Type":"ContainerDied","Data":"295697bede6aa698299ec28c04a8478b36ad90bd9fa78a3c5463edf4c572355f"} Feb 23 09:00:04 crc kubenswrapper[5047]: I0223 09:00:04.078864 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="295697bede6aa698299ec28c04a8478b36ad90bd9fa78a3c5463edf4c572355f" Feb 23 09:00:04 crc kubenswrapper[5047]: I0223 09:00:04.078933 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-985ld" Feb 23 09:00:04 crc kubenswrapper[5047]: I0223 09:00:04.609067 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn"] Feb 23 09:00:04 crc kubenswrapper[5047]: I0223 09:00:04.615833 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-khwmn"] Feb 23 09:00:06 crc kubenswrapper[5047]: I0223 09:00:06.363025 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d81715-a150-4c19-8c5f-ac88e34317c2" path="/var/lib/kubelet/pods/b0d81715-a150-4c19-8c5f-ac88e34317c2/volumes" Feb 23 09:00:09 crc kubenswrapper[5047]: I0223 09:00:09.042187 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lh9zc"] Feb 23 09:00:09 crc kubenswrapper[5047]: I0223 09:00:09.051211 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5aa7-account-create-update-7sxdw"] Feb 23 09:00:09 crc kubenswrapper[5047]: I0223 09:00:09.061674 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lh9zc"] Feb 23 09:00:09 crc kubenswrapper[5047]: I0223 09:00:09.071361 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5aa7-account-create-update-7sxdw"] Feb 23 09:00:10 crc kubenswrapper[5047]: I0223 09:00:10.368037 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2461d7-38c9-4c73-b4d1-4bfb14579729" path="/var/lib/kubelet/pods/0d2461d7-38c9-4c73-b4d1-4bfb14579729/volumes" Feb 23 09:00:10 crc kubenswrapper[5047]: I0223 09:00:10.369284 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56cc0be6-76ee-4ceb-941c-7de1f44f8e5a" path="/var/lib/kubelet/pods/56cc0be6-76ee-4ceb-941c-7de1f44f8e5a/volumes" Feb 23 09:00:20 crc kubenswrapper[5047]: I0223 09:00:20.033728 5047 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zdtw6"] Feb 23 09:00:20 crc kubenswrapper[5047]: I0223 09:00:20.043938 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zdtw6"] Feb 23 09:00:20 crc kubenswrapper[5047]: I0223 09:00:20.444406 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ecf089-0275-4659-9180-dd68b3ee8b3a" path="/var/lib/kubelet/pods/26ecf089-0275-4659-9180-dd68b3ee8b3a/volumes" Feb 23 09:00:46 crc kubenswrapper[5047]: I0223 09:00:46.759508 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:00:46 crc kubenswrapper[5047]: I0223 09:00:46.760236 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.104794 5047 scope.go:117] "RemoveContainer" containerID="c2708b681368b38603116c4a55fd3d4ee45d4338cf1ec012ce29a141b51ed45d" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.139543 5047 scope.go:117] "RemoveContainer" containerID="82664f001a677293fec10488e9ce7ea1ab3bc3bd67b4a1161cfbf65efe2a9a95" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.177020 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29530621-csdh2"] Feb 23 09:01:00 crc kubenswrapper[5047]: E0223 09:01:00.178095 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d" containerName="collect-profiles" Feb 23 09:01:00 crc 
kubenswrapper[5047]: I0223 09:01:00.178131 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d" containerName="collect-profiles" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.178512 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0fe3ebe-6b2c-4070-b7da-23f413a0fd0d" containerName="collect-profiles" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.179688 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.199334 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530621-csdh2"] Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.200064 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8wt\" (UniqueName: \"kubernetes.io/projected/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-kube-api-access-jd8wt\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.200238 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-combined-ca-bundle\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.200426 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-fernet-keys\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: 
I0223 09:01:00.200714 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-config-data\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.218672 5047 scope.go:117] "RemoveContainer" containerID="e8f5ead500e17b9d657ae1be75538f707abe9c64946bc1720280b8e0264adbc1" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.277123 5047 scope.go:117] "RemoveContainer" containerID="afd4a6068fba22f091867e206c4d7e4c0238f6a75c0b5879958e874b98751887" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.303660 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-combined-ca-bundle\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.303729 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-fernet-keys\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.303935 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-config-data\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.303967 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jd8wt\" (UniqueName: \"kubernetes.io/projected/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-kube-api-access-jd8wt\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.311896 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-fernet-keys\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.312343 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-config-data\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.313835 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-combined-ca-bundle\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.325932 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8wt\" (UniqueName: \"kubernetes.io/projected/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-kube-api-access-jd8wt\") pod \"keystone-cron-29530621-csdh2\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.387988 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:00 crc kubenswrapper[5047]: I0223 09:01:00.970430 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530621-csdh2"] Feb 23 09:01:00 crc kubenswrapper[5047]: W0223 09:01:00.979603 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod854c6fd4_0b9e_4479_b732_d9ae5f72ea4f.slice/crio-d6a0af37732d44ec6a55379d811f20eb8477e3b3a8f534e8f55bdca7e046734a WatchSource:0}: Error finding container d6a0af37732d44ec6a55379d811f20eb8477e3b3a8f534e8f55bdca7e046734a: Status 404 returned error can't find the container with id d6a0af37732d44ec6a55379d811f20eb8477e3b3a8f534e8f55bdca7e046734a Feb 23 09:01:01 crc kubenswrapper[5047]: I0223 09:01:01.746312 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-csdh2" event={"ID":"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f","Type":"ContainerStarted","Data":"b8b6a129529add188d1a88c8e83dc2d8bb33b5794c67f05f63bbf20295fd9dd5"} Feb 23 09:01:01 crc kubenswrapper[5047]: I0223 09:01:01.746790 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-csdh2" event={"ID":"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f","Type":"ContainerStarted","Data":"d6a0af37732d44ec6a55379d811f20eb8477e3b3a8f534e8f55bdca7e046734a"} Feb 23 09:01:01 crc kubenswrapper[5047]: I0223 09:01:01.782009 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29530621-csdh2" podStartSLOduration=1.781971146 podStartE2EDuration="1.781971146s" podCreationTimestamp="2026-02-23 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:01:01.771406511 +0000 UTC m=+8184.022733735" watchObservedRunningTime="2026-02-23 09:01:01.781971146 +0000 UTC m=+8184.033298370" Feb 23 09:01:04 crc 
kubenswrapper[5047]: I0223 09:01:04.782822 5047 generic.go:334] "Generic (PLEG): container finished" podID="854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" containerID="b8b6a129529add188d1a88c8e83dc2d8bb33b5794c67f05f63bbf20295fd9dd5" exitCode=0 Feb 23 09:01:04 crc kubenswrapper[5047]: I0223 09:01:04.783011 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-csdh2" event={"ID":"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f","Type":"ContainerDied","Data":"b8b6a129529add188d1a88c8e83dc2d8bb33b5794c67f05f63bbf20295fd9dd5"} Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.305094 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.371655 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd8wt\" (UniqueName: \"kubernetes.io/projected/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-kube-api-access-jd8wt\") pod \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.371758 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-combined-ca-bundle\") pod \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.372206 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-config-data\") pod \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.372293 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-fernet-keys\") pod \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\" (UID: \"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f\") " Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.377875 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" (UID: "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.377894 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-kube-api-access-jd8wt" (OuterVolumeSpecName: "kube-api-access-jd8wt") pod "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" (UID: "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f"). InnerVolumeSpecName "kube-api-access-jd8wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.410017 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" (UID: "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.428511 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-config-data" (OuterVolumeSpecName: "config-data") pod "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" (UID: "854c6fd4-0b9e-4479-b732-d9ae5f72ea4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.475464 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.475884 5047 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.475930 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd8wt\" (UniqueName: \"kubernetes.io/projected/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-kube-api-access-jd8wt\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.475952 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.816407 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-csdh2" event={"ID":"854c6fd4-0b9e-4479-b732-d9ae5f72ea4f","Type":"ContainerDied","Data":"d6a0af37732d44ec6a55379d811f20eb8477e3b3a8f534e8f55bdca7e046734a"} Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.816454 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6a0af37732d44ec6a55379d811f20eb8477e3b3a8f534e8f55bdca7e046734a" Feb 23 09:01:06 crc kubenswrapper[5047]: I0223 09:01:06.816501 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530621-csdh2" Feb 23 09:01:16 crc kubenswrapper[5047]: I0223 09:01:16.760083 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:01:16 crc kubenswrapper[5047]: I0223 09:01:16.760981 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.067007 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-dwtpm"] Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.078038 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9pg8r"] Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.088248 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-p9mvk"] Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.098130 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9pg8r"] Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.107595 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-dwtpm"] Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.115617 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-p9mvk"] Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.355969 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a81e655-7763-4211-a07b-8fcf1ab82ede" 
path="/var/lib/kubelet/pods/7a81e655-7763-4211-a07b-8fcf1ab82ede/volumes" Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.356738 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f76d580-9fff-4d98-854c-01082ff6b82a" path="/var/lib/kubelet/pods/8f76d580-9fff-4d98-854c-01082ff6b82a/volumes" Feb 23 09:01:18 crc kubenswrapper[5047]: I0223 09:01:18.357426 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13aae8d-09b0-4975-80da-a02d609028d7" path="/var/lib/kubelet/pods/f13aae8d-09b0-4975-80da-a02d609028d7/volumes" Feb 23 09:01:19 crc kubenswrapper[5047]: I0223 09:01:19.055235 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wwttw"] Feb 23 09:01:19 crc kubenswrapper[5047]: I0223 09:01:19.071145 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-xkq2k"] Feb 23 09:01:19 crc kubenswrapper[5047]: I0223 09:01:19.083036 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-6p4rv"] Feb 23 09:01:19 crc kubenswrapper[5047]: I0223 09:01:19.096346 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wwttw"] Feb 23 09:01:19 crc kubenswrapper[5047]: I0223 09:01:19.105354 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-xkq2k"] Feb 23 09:01:19 crc kubenswrapper[5047]: I0223 09:01:19.113638 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-6p4rv"] Feb 23 09:01:20 crc kubenswrapper[5047]: I0223 09:01:20.364531 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6" path="/var/lib/kubelet/pods/3df2c411-4a79-4b6a-bf8c-4efdd1a28ba6/volumes" Feb 23 09:01:20 crc kubenswrapper[5047]: I0223 09:01:20.365592 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="699962d8-9437-4095-8743-67593f336f95" path="/var/lib/kubelet/pods/699962d8-9437-4095-8743-67593f336f95/volumes" Feb 23 09:01:20 crc kubenswrapper[5047]: I0223 09:01:20.366531 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e634979e-62c4-4419-8917-e9e4b008de10" path="/var/lib/kubelet/pods/e634979e-62c4-4419-8917-e9e4b008de10/volumes" Feb 23 09:01:42 crc kubenswrapper[5047]: I0223 09:01:42.086666 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wg7nn"] Feb 23 09:01:42 crc kubenswrapper[5047]: I0223 09:01:42.099437 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wg7nn"] Feb 23 09:01:42 crc kubenswrapper[5047]: I0223 09:01:42.352104 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e16a6f3-b78e-4388-a881-8371a2e2295f" path="/var/lib/kubelet/pods/5e16a6f3-b78e-4388-a881-8371a2e2295f/volumes" Feb 23 09:01:46 crc kubenswrapper[5047]: I0223 09:01:46.759934 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:01:46 crc kubenswrapper[5047]: I0223 09:01:46.760517 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:01:46 crc kubenswrapper[5047]: I0223 09:01:46.760570 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" Feb 23 09:01:46 crc kubenswrapper[5047]: I0223 09:01:46.761458 5047 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:01:46 crc kubenswrapper[5047]: I0223 09:01:46.761516 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" gracePeriod=600 Feb 23 09:01:46 crc kubenswrapper[5047]: E0223 09:01:46.894720 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:01:47 crc kubenswrapper[5047]: I0223 09:01:47.244460 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" exitCode=0 Feb 23 09:01:47 crc kubenswrapper[5047]: I0223 09:01:47.244505 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492"} Feb 23 09:01:47 crc kubenswrapper[5047]: I0223 09:01:47.244537 5047 scope.go:117] "RemoveContainer" 
containerID="5ec7e4d435932fd8405e5cc018a986356ba6984287b8f0f9c351f6397165ec6c" Feb 23 09:01:47 crc kubenswrapper[5047]: I0223 09:01:47.245683 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:01:47 crc kubenswrapper[5047]: E0223 09:01:47.246323 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:02:00 crc kubenswrapper[5047]: I0223 09:02:00.418778 5047 scope.go:117] "RemoveContainer" containerID="6f3df3e41ece39ce4894e89c3599e80bdf48470ac34e98e79a40c703bb5ad2cb" Feb 23 09:02:00 crc kubenswrapper[5047]: I0223 09:02:00.468937 5047 scope.go:117] "RemoveContainer" containerID="62d3ce1c8e36802de08108bf2d341c7932489c2988b9b51c0a31bb156c86c725" Feb 23 09:02:00 crc kubenswrapper[5047]: I0223 09:02:00.571679 5047 scope.go:117] "RemoveContainer" containerID="518e4d6da9c96b15872e77d6d277ee4e6e57682d3ea3a0de2b053eb45a8a73e0" Feb 23 09:02:00 crc kubenswrapper[5047]: I0223 09:02:00.595518 5047 scope.go:117] "RemoveContainer" containerID="84015c9c9972656e9b3d1708d54e6f2eab6c8245200499b42efa88851b8205f3" Feb 23 09:02:00 crc kubenswrapper[5047]: I0223 09:02:00.651657 5047 scope.go:117] "RemoveContainer" containerID="c96385941abbc12d87ca7d23f31336f6e5447d8ab180955bf9e313dba733d5fb" Feb 23 09:02:00 crc kubenswrapper[5047]: I0223 09:02:00.695344 5047 scope.go:117] "RemoveContainer" containerID="87f6e8f1567802a3589710cbc69996fba4d07e1b87bdd2d285304d8b7b5997b1" Feb 23 09:02:00 crc kubenswrapper[5047]: I0223 09:02:00.754974 5047 scope.go:117] "RemoveContainer" 
containerID="7badcde11eeef8d36fa1c5d05192588cff9fabb6a6326c412253d56983b6de3a" Feb 23 09:02:01 crc kubenswrapper[5047]: I0223 09:02:01.341786 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:02:01 crc kubenswrapper[5047]: E0223 09:02:01.342797 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:02:02 crc kubenswrapper[5047]: I0223 09:02:02.044995 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-997jz"] Feb 23 09:02:02 crc kubenswrapper[5047]: I0223 09:02:02.053683 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-997jz"] Feb 23 09:02:02 crc kubenswrapper[5047]: I0223 09:02:02.354998 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafcfc67-d904-4d7b-946f-1f04e880b98f" path="/var/lib/kubelet/pods/dafcfc67-d904-4d7b-946f-1f04e880b98f/volumes" Feb 23 09:02:03 crc kubenswrapper[5047]: I0223 09:02:03.039091 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-49rcc"] Feb 23 09:02:03 crc kubenswrapper[5047]: I0223 09:02:03.054835 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-49rcc"] Feb 23 09:02:04 crc kubenswrapper[5047]: I0223 09:02:04.358930 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dde3e53-3ac7-464f-a504-e1b815844cb4" path="/var/lib/kubelet/pods/3dde3e53-3ac7-464f-a504-e1b815844cb4/volumes" Feb 23 09:02:13 crc kubenswrapper[5047]: I0223 09:02:13.341095 5047 
scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:02:13 crc kubenswrapper[5047]: E0223 09:02:13.342052 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:02:28 crc kubenswrapper[5047]: I0223 09:02:28.354744 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:02:28 crc kubenswrapper[5047]: E0223 09:02:28.356439 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.191624 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-317c-account-create-update-ppk28"] Feb 23 09:02:38 crc kubenswrapper[5047]: E0223 09:02:38.192501 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" containerName="keystone-cron" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.192513 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" containerName="keystone-cron" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.192686 5047 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" containerName="keystone-cron" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.193472 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.196207 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.275005 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nzgh\" (UniqueName: \"kubernetes.io/projected/fe43cf5f-4813-4df4-ba76-345d096d5816-kube-api-access-2nzgh\") pod \"neutron-317c-account-create-update-ppk28\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.275110 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe43cf5f-4813-4df4-ba76-345d096d5816-operator-scripts\") pod \"neutron-317c-account-create-update-ppk28\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.292325 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-317c-account-create-update-ppk28"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.320574 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rl8wd"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.321813 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.342202 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.382272 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe43cf5f-4813-4df4-ba76-345d096d5816-operator-scripts\") pod \"neutron-317c-account-create-update-ppk28\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.382360 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knbv\" (UniqueName: \"kubernetes.io/projected/b4565dbe-dc04-452f-a79e-bc09cb299f29-kube-api-access-7knbv\") pod \"root-account-create-update-rl8wd\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") " pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.382440 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4565dbe-dc04-452f-a79e-bc09cb299f29-operator-scripts\") pod \"root-account-create-update-rl8wd\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") " pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.382469 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nzgh\" (UniqueName: \"kubernetes.io/projected/fe43cf5f-4813-4df4-ba76-345d096d5816-kube-api-access-2nzgh\") pod \"neutron-317c-account-create-update-ppk28\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 
09:02:38.383556 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe43cf5f-4813-4df4-ba76-345d096d5816-operator-scripts\") pod \"neutron-317c-account-create-update-ppk28\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.406100 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rl8wd"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.419328 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b60b-account-create-update-rwx74"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.420777 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.429475 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.450977 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b60b-account-create-update-rwx74"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.483866 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4565dbe-dc04-452f-a79e-bc09cb299f29-operator-scripts\") pod \"root-account-create-update-rl8wd\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") " pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.483989 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9vlg\" (UniqueName: \"kubernetes.io/projected/ac795337-f981-4559-a2d7-9e3eb81f9e33-kube-api-access-g9vlg\") pod \"barbican-b60b-account-create-update-rwx74\" (UID: 
\"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.484046 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac795337-f981-4559-a2d7-9e3eb81f9e33-operator-scripts\") pod \"barbican-b60b-account-create-update-rwx74\" (UID: \"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.484084 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7knbv\" (UniqueName: \"kubernetes.io/projected/b4565dbe-dc04-452f-a79e-bc09cb299f29-kube-api-access-7knbv\") pod \"root-account-create-update-rl8wd\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") " pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.484978 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4565dbe-dc04-452f-a79e-bc09cb299f29-operator-scripts\") pod \"root-account-create-update-rl8wd\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") " pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.498640 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.516312 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nzgh\" (UniqueName: \"kubernetes.io/projected/fe43cf5f-4813-4df4-ba76-345d096d5816-kube-api-access-2nzgh\") pod \"neutron-317c-account-create-update-ppk28\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.527352 5047 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.557566 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7knbv\" (UniqueName: \"kubernetes.io/projected/b4565dbe-dc04-452f-a79e-bc09cb299f29-kube-api-access-7knbv\") pod \"root-account-create-update-rl8wd\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") " pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.592362 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9vlg\" (UniqueName: \"kubernetes.io/projected/ac795337-f981-4559-a2d7-9e3eb81f9e33-kube-api-access-g9vlg\") pod \"barbican-b60b-account-create-update-rwx74\" (UID: \"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.592848 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac795337-f981-4559-a2d7-9e3eb81f9e33-operator-scripts\") pod \"barbican-b60b-account-create-update-rwx74\" (UID: \"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.593752 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac795337-f981-4559-a2d7-9e3eb81f9e33-operator-scripts\") pod \"barbican-b60b-account-create-update-rwx74\" (UID: \"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.647161 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.647369 5047 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" containerName="openstackclient" containerID="cri-o://69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76" gracePeriod=2 Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.656587 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9vlg\" (UniqueName: \"kubernetes.io/projected/ac795337-f981-4559-a2d7-9e3eb81f9e33-kube-api-access-g9vlg\") pod \"barbican-b60b-account-create-update-rwx74\" (UID: \"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.689805 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rl8wd" Feb 23 09:02:38 crc kubenswrapper[5047]: E0223 09:02:38.702739 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 09:02:38 crc kubenswrapper[5047]: E0223 09:02:38.702791 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data podName:d8f82aad-7df9-4b14-a328-2cc708aeed84 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:39.202776543 +0000 UTC m=+8281.454103677 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data") pod "rabbitmq-server-0" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84") : configmap "rabbitmq-config-data" not found Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.709002 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.709830 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0953-account-create-update-psp84"] Feb 23 09:02:38 crc kubenswrapper[5047]: E0223 09:02:38.710289 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" containerName="openstackclient" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.710302 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" containerName="openstackclient" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.710542 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" containerName="openstackclient" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.711281 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.749603 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.775588 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0ef9-account-create-update-xrnmv"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.776950 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.777134 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.807889 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbvp4\" (UniqueName: \"kubernetes.io/projected/7607a87e-f072-4cdf-ba60-40b579d694ab-kube-api-access-wbvp4\") pod \"glance-0953-account-create-update-psp84\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") " pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.808043 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7607a87e-f072-4cdf-ba60-40b579d694ab-operator-scripts\") pod \"glance-0953-account-create-update-psp84\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") " pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.814924 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.839192 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5aa7-account-create-update-khr6g"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.840403 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.866448 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.903305 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0953-account-create-update-psp84"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.916149 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbvp4\" (UniqueName: \"kubernetes.io/projected/7607a87e-f072-4cdf-ba60-40b579d694ab-kube-api-access-wbvp4\") pod \"glance-0953-account-create-update-psp84\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") " pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.916203 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9j7g\" (UniqueName: \"kubernetes.io/projected/a0b8f96e-aaed-43c4-9906-8920be9f478b-kube-api-access-h9j7g\") pod \"nova-api-0ef9-account-create-update-xrnmv\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") " pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.916264 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b8f96e-aaed-43c4-9906-8920be9f478b-operator-scripts\") pod \"nova-api-0ef9-account-create-update-xrnmv\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") " pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.916335 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7607a87e-f072-4cdf-ba60-40b579d694ab-operator-scripts\") pod 
\"glance-0953-account-create-update-psp84\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") " pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.917019 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7607a87e-f072-4cdf-ba60-40b579d694ab-operator-scripts\") pod \"glance-0953-account-create-update-psp84\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") " pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.952595 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-xrnmv"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.972968 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5aa7-account-create-update-khr6g"] Feb 23 09:02:38 crc kubenswrapper[5047]: I0223 09:02:38.995049 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-n2bdk"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.003611 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.020002 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.020198 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b8f96e-aaed-43c4-9906-8920be9f478b-operator-scripts\") pod \"nova-api-0ef9-account-create-update-xrnmv\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") " pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.020383 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc9cs\" (UniqueName: \"kubernetes.io/projected/f287ea57-5fcf-42d6-a886-fb5f35962785-kube-api-access-dc9cs\") pod \"placement-5aa7-account-create-update-khr6g\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") " pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.020442 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9j7g\" (UniqueName: \"kubernetes.io/projected/a0b8f96e-aaed-43c4-9906-8920be9f478b-kube-api-access-h9j7g\") pod \"nova-api-0ef9-account-create-update-xrnmv\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") " pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.020957 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b8f96e-aaed-43c4-9906-8920be9f478b-operator-scripts\") pod \"nova-api-0ef9-account-create-update-xrnmv\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") " pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:39 crc kubenswrapper[5047]: 
I0223 09:02:39.054193 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f287ea57-5fcf-42d6-a886-fb5f35962785-operator-scripts\") pod \"placement-5aa7-account-create-update-khr6g\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") " pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.058658 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbvp4\" (UniqueName: \"kubernetes.io/projected/7607a87e-f072-4cdf-ba60-40b579d694ab-kube-api-access-wbvp4\") pod \"glance-0953-account-create-update-psp84\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") " pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.106449 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9j7g\" (UniqueName: \"kubernetes.io/projected/a0b8f96e-aaed-43c4-9906-8920be9f478b-kube-api-access-h9j7g\") pod \"nova-api-0ef9-account-create-update-xrnmv\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") " pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.182773 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6c0a-account-create-update-mngkq"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.185035 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.200269 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.217743 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-n2bdk"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.249045 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjvw8\" (UniqueName: \"kubernetes.io/projected/dd7a46ed-9033-4c72-9f92-34816276560b-kube-api-access-vjvw8\") pod \"nova-cell0-aad1-account-create-update-n2bdk\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") " pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.249179 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc9cs\" (UniqueName: \"kubernetes.io/projected/f287ea57-5fcf-42d6-a886-fb5f35962785-kube-api-access-dc9cs\") pod \"placement-5aa7-account-create-update-khr6g\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") " pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.249204 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a46ed-9033-4c72-9f92-34816276560b-operator-scripts\") pod \"nova-cell0-aad1-account-create-update-n2bdk\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") " pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.249243 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f287ea57-5fcf-42d6-a886-fb5f35962785-operator-scripts\") pod \"placement-5aa7-account-create-update-khr6g\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") " pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.249987 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f287ea57-5fcf-42d6-a886-fb5f35962785-operator-scripts\") pod \"placement-5aa7-account-create-update-khr6g\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") " pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.250412 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0953-account-create-update-psp84" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.261880 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:39 crc kubenswrapper[5047]: E0223 09:02:39.263192 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 09:02:39 crc kubenswrapper[5047]: E0223 09:02:39.263249 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data podName:d8f82aad-7df9-4b14-a328-2cc708aeed84 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:40.26322918 +0000 UTC m=+8282.514556314 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data") pod "rabbitmq-server-0" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84") : configmap "rabbitmq-config-data" not found Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.318523 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-r57lp"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.328495 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.331464 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.356495 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjvw8\" (UniqueName: \"kubernetes.io/projected/dd7a46ed-9033-4c72-9f92-34816276560b-kube-api-access-vjvw8\") pod \"nova-cell0-aad1-account-create-update-n2bdk\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") " pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.356852 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a46ed-9033-4c72-9f92-34816276560b-operator-scripts\") pod \"nova-cell0-aad1-account-create-update-n2bdk\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") " pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.356998 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nnkk\" (UniqueName: \"kubernetes.io/projected/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-kube-api-access-7nnkk\") pod \"cinder-6c0a-account-create-update-mngkq\" 
(UID: \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") " pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.357959 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a46ed-9033-4c72-9f92-34816276560b-operator-scripts\") pod \"nova-cell0-aad1-account-create-update-n2bdk\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") " pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.360562 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-operator-scripts\") pod \"cinder-6c0a-account-create-update-mngkq\" (UID: \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") " pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.362945 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6c0a-account-create-update-mngkq"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.367817 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc9cs\" (UniqueName: \"kubernetes.io/projected/f287ea57-5fcf-42d6-a886-fb5f35962785-kube-api-access-dc9cs\") pod \"placement-5aa7-account-create-update-khr6g\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") " pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.422119 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-r57lp"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.428664 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjvw8\" (UniqueName: \"kubernetes.io/projected/dd7a46ed-9033-4c72-9f92-34816276560b-kube-api-access-vjvw8\") pod 
\"nova-cell0-aad1-account-create-update-n2bdk\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") " pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.471996 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-d754-account-create-update-78cx7"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.475128 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts\") pod \"nova-cell1-b6cf-account-create-update-r57lp\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") " pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.475197 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nnkk\" (UniqueName: \"kubernetes.io/projected/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-kube-api-access-7nnkk\") pod \"cinder-6c0a-account-create-update-mngkq\" (UID: \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") " pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.475273 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t46w\" (UniqueName: \"kubernetes.io/projected/e444ce7d-b56a-406c-91ff-4623a469c13a-kube-api-access-4t46w\") pod \"nova-cell1-b6cf-account-create-update-r57lp\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") " pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.475312 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-operator-scripts\") pod \"cinder-6c0a-account-create-update-mngkq\" (UID: 
\"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") " pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.478264 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-operator-scripts\") pod \"cinder-6c0a-account-create-update-mngkq\" (UID: \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") " pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.481125 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-d754-account-create-update-78cx7"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.501381 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.549338 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nnkk\" (UniqueName: \"kubernetes.io/projected/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-kube-api-access-7nnkk\") pod \"cinder-6c0a-account-create-update-mngkq\" (UID: \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") " pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.581489 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.583189 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts\") pod \"nova-cell1-b6cf-account-create-update-r57lp\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") " pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.583265 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t46w\" (UniqueName: \"kubernetes.io/projected/e444ce7d-b56a-406c-91ff-4623a469c13a-kube-api-access-4t46w\") pod \"nova-cell1-b6cf-account-create-update-r57lp\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") " pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.583871 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts\") pod \"nova-cell1-b6cf-account-create-update-r57lp\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") " pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.644038 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-jkd76"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.663467 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.664341 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="openstack-network-exporter" 
containerID="cri-o://5b5cdce23bc6b1850489f770e11653c0e0f5c790fb261d4311d63e59f1d91444" gracePeriod=300 Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.674498 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.679322 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t46w\" (UniqueName: \"kubernetes.io/projected/e444ce7d-b56a-406c-91ff-4623a469c13a-kube-api-access-4t46w\") pod \"nova-cell1-b6cf-account-create-update-r57lp\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") " pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:39 crc kubenswrapper[5047]: E0223 09:02:39.687020 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:39 crc kubenswrapper[5047]: E0223 09:02:39.687097 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data podName:db558a41-6dbf-4b18-af50-6a5311530ef4 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:40.187079487 +0000 UTC m=+8282.438406621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data") pod "rabbitmq-cell1-server-0" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4") : configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.696337 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-jkd76"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.723426 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.724475 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-1" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="openstack-network-exporter" containerID="cri-o://c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1" gracePeriod=300 Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.734486 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.734829 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-2" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="openstack-network-exporter" containerID="cri-o://f3246bebc9ebdff6972450f73eaa67d34d962a161929ce55ff3505b8bc91d983" gracePeriod=300 Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.858335 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.858801 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-2" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="openstack-network-exporter" containerID="cri-o://dc2a3c442046d0ca90c2e2cf7cce3b69605e0fac97b43c4fa57bbf160f1e4fe4" gracePeriod=300 Feb 23 09:02:39 crc 
kubenswrapper[5047]: I0223 09:02:39.891014 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.891414 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-1" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="openstack-network-exporter" containerID="cri-o://855e342977c6d200e9eb4077a8fada2e25afccca32a14e64e70912cbfbb54b29" gracePeriod=300 Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.950177 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.950497 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="ovn-northd" containerID="cri-o://5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b" gracePeriod=30 Feb 23 09:02:39 crc kubenswrapper[5047]: I0223 09:02:39.950657 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="openstack-network-exporter" containerID="cri-o://300cc5bda7124e79cc3bdc763a253160aad67d2c3e2c6d44cbdf59edc78788c3" gracePeriod=30 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.067067 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.067410 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="openstack-network-exporter" containerID="cri-o://33c766011377115c89385b76818662881b608592e8f5fe9eacc9c3fff7aad9a0" gracePeriod=300 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.095578 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-mngkq" Feb 23 09:02:40 crc kubenswrapper[5047]: E0223 09:02:40.208363 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:40 crc kubenswrapper[5047]: E0223 09:02:40.208505 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data podName:db558a41-6dbf-4b18-af50-6a5311530ef4 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:41.208462551 +0000 UTC m=+8283.459789685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data") pod "rabbitmq-cell1-server-0" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4") : configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.296096 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mlbcp"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.299675 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mlbcp"] Feb 23 09:02:40 crc kubenswrapper[5047]: E0223 09:02:40.311369 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 09:02:40 crc kubenswrapper[5047]: E0223 09:02:40.311457 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data podName:d8f82aad-7df9-4b14-a328-2cc708aeed84 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:42.311437955 +0000 UTC m=+8284.562765099 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data") pod "rabbitmq-server-0" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84") : configmap "rabbitmq-config-data" not found Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.336569 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-dcb1-account-create-update-s6wxg"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.337874 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.353260 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.412667 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8bbc\" (UniqueName: \"kubernetes.io/projected/26d4528b-cbbd-4c90-97e6-73924844d615-kube-api-access-l8bbc\") pod \"aodh-dcb1-account-create-update-s6wxg\" (UID: \"26d4528b-cbbd-4c90-97e6-73924844d615\") " pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.412887 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d4528b-cbbd-4c90-97e6-73924844d615-operator-scripts\") pod \"aodh-dcb1-account-create-update-s6wxg\" (UID: \"26d4528b-cbbd-4c90-97e6-73924844d615\") " pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.514310 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d4528b-cbbd-4c90-97e6-73924844d615-operator-scripts\") pod \"aodh-dcb1-account-create-update-s6wxg\" (UID: 
\"26d4528b-cbbd-4c90-97e6-73924844d615\") " pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.514375 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8bbc\" (UniqueName: \"kubernetes.io/projected/26d4528b-cbbd-4c90-97e6-73924844d615-kube-api-access-l8bbc\") pod \"aodh-dcb1-account-create-update-s6wxg\" (UID: \"26d4528b-cbbd-4c90-97e6-73924844d615\") " pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.516137 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d4528b-cbbd-4c90-97e6-73924844d615-operator-scripts\") pod \"aodh-dcb1-account-create-update-s6wxg\" (UID: \"26d4528b-cbbd-4c90-97e6-73924844d615\") " pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.529248 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="ovsdbserver-sb" containerID="cri-o://3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4" gracePeriod=300 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.551796 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8bbc\" (UniqueName: \"kubernetes.io/projected/26d4528b-cbbd-4c90-97e6-73924844d615-kube-api-access-l8bbc\") pod \"aodh-dcb1-account-create-update-s6wxg\" (UID: \"26d4528b-cbbd-4c90-97e6-73924844d615\") " pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.591079 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="ovsdbserver-nb" 
containerID="cri-o://809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8" gracePeriod=300 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.613601 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4323bc60-603e-45a2-81fb-7a8fa07d07c7" path="/var/lib/kubelet/pods/4323bc60-603e-45a2-81fb-7a8fa07d07c7/volumes" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.619758 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49355e96-b059-44c8-8a2e-f4f0b3486526" path="/var/lib/kubelet/pods/49355e96-b059-44c8-8a2e-f4f0b3486526/volumes" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.622591 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553" path="/var/lib/kubelet/pods/cd5a5bd3-90cf-4e07-ae1c-53f13ee3a553/volumes" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.636589 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-dcb1-account-create-update-s6wxg"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.636716 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-dcb1-account-create-update-96tpc"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.636800 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-dcb1-account-create-update-96tpc"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.636863 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749799ffdc-q8kjx"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.637277 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" podUID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerName="dnsmasq-dns" containerID="cri-o://071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c" gracePeriod=10 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.683986 5047 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-755d9b4d6f-jbxjv"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.684482 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-755d9b4d6f-jbxjv" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-api" containerID="cri-o://90d8272e14124df09a1e4603e8bf04a61add7c1cf9078eea0c18f84d9f987fe3" gracePeriod=30 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.686237 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.687268 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-755d9b4d6f-jbxjv" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-httpd" containerID="cri-o://9ea42abec4f78326158bc5cb0cec2641d24efe082290fbd2778b5995dbeb07c2" gracePeriod=30 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.752990 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-jppvl"] Feb 23 09:02:40 crc kubenswrapper[5047]: E0223 09:02:40.766963 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.778134 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-jppvl"] Feb 23 09:02:40 crc kubenswrapper[5047]: E0223 09:02:40.795259 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts podName:e444ce7d-b56a-406c-91ff-4623a469c13a nodeName:}" failed. No retries permitted until 2026-02-23 09:02:41.295220125 +0000 UTC m=+8283.546547259 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts") pod "nova-cell1-b6cf-account-create-update-r57lp" (UID: "e444ce7d-b56a-406c-91ff-4623a469c13a") : configmap "openstack-cell1-scripts" not found Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.795506 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-qv6xg"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.851983 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-qv6xg"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.926084 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.926320 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-log" containerID="cri-o://ec75f8350065896340138084424783a8ac231166ea7fffbf795299442b505721" gracePeriod=30 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.926784 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-httpd" containerID="cri-o://8c44a4d060fe0afbdab6a1f05ac20d1249c428efb3b01ca24ed8e720e9d10271" gracePeriod=30 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.943860 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.944119 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api-log" containerID="cri-o://3cd1af755bfcf4970c210c91d87839f5bae5b4edc4ad2683851e88ccef337c33" 
gracePeriod=30 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.947660 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api" containerID="cri-o://6a1e38661d1c5ddbb45bad7c88e6e2bbc8848d6927477296120c0dddf2445603" gracePeriod=30 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.957711 5047 generic.go:334] "Generic (PLEG): container finished" podID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerID="dc2a3c442046d0ca90c2e2cf7cce3b69605e0fac97b43c4fa57bbf160f1e4fe4" exitCode=2 Feb 23 09:02:40 crc kubenswrapper[5047]: I0223 09:02:40.958109 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8","Type":"ContainerDied","Data":"dc2a3c442046d0ca90c2e2cf7cce3b69605e0fac97b43c4fa57bbf160f1e4fe4"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.007139 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-317c-account-create-update-ppk28"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.014606 5047 generic.go:334] "Generic (PLEG): container finished" podID="385bab5b-7cad-4274-b672-a0614be3f41e" containerID="f3246bebc9ebdff6972450f73eaa67d34d962a161929ce55ff3505b8bc91d983" exitCode=2 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.014661 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"385bab5b-7cad-4274-b672-a0614be3f41e","Type":"ContainerDied","Data":"f3246bebc9ebdff6972450f73eaa67d34d962a161929ce55ff3505b8bc91d983"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.022623 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9205930b-2303-4b01-a3bf-cf4ef3ad0a49/ovsdbserver-nb/0.log" Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.022667 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerID="33c766011377115c89385b76818662881b608592e8f5fe9eacc9c3fff7aad9a0" exitCode=2 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.022718 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9205930b-2303-4b01-a3bf-cf4ef3ad0a49","Type":"ContainerDied","Data":"33c766011377115c89385b76818662881b608592e8f5fe9eacc9c3fff7aad9a0"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.049429 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0953-account-create-update-psp84"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.070256 5047 generic.go:334] "Generic (PLEG): container finished" podID="c59427cb-019c-4f83-af18-75900909e70f" containerID="300cc5bda7124e79cc3bdc763a253160aad67d2c3e2c6d44cbdf59edc78788c3" exitCode=2 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.070398 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c59427cb-019c-4f83-af18-75900909e70f","Type":"ContainerDied","Data":"300cc5bda7124e79cc3bdc763a253160aad67d2c3e2c6d44cbdf59edc78788c3"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.094005 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-s6wxg" Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.098061 5047 generic.go:334] "Generic (PLEG): container finished" podID="146c85be-9d67-4281-873e-b27f5e90d957" containerID="855e342977c6d200e9eb4077a8fada2e25afccca32a14e64e70912cbfbb54b29" exitCode=2 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.098126 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"146c85be-9d67-4281-873e-b27f5e90d957","Type":"ContainerDied","Data":"855e342977c6d200e9eb4077a8fada2e25afccca32a14e64e70912cbfbb54b29"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.113952 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.114394 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="cinder-scheduler" containerID="cri-o://458b0e1642c869d058c9b43e3416f43f71ec7ed0417844e4fdd5515463a4d080" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.114651 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="probe" containerID="cri-o://0ff894197ec53c6fa19bcceb0a1998cc1302ed5266ed87a7fab99b18cfd8df6e" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.123128 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_346c3f0d-e5fb-40f6-bd1e-65679466165f/ovsdbserver-sb/0.log" Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.123174 5047 generic.go:334] "Generic (PLEG): container finished" podID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerID="5b5cdce23bc6b1850489f770e11653c0e0f5c790fb261d4311d63e59f1d91444" exitCode=2 Feb 23 09:02:41 crc kubenswrapper[5047]: 
I0223 09:02:41.123192 5047 generic.go:334] "Generic (PLEG): container finished" podID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerID="3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4" exitCode=143 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.123261 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"346c3f0d-e5fb-40f6-bd1e-65679466165f","Type":"ContainerDied","Data":"5b5cdce23bc6b1850489f770e11653c0e0f5c790fb261d4311d63e59f1d91444"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.123289 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"346c3f0d-e5fb-40f6-bd1e-65679466165f","Type":"ContainerDied","Data":"3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.132243 5047 generic.go:334] "Generic (PLEG): container finished" podID="74acc7db-4095-419a-9b09-afa04283a69f" containerID="c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1" exitCode=2 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.132295 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"74acc7db-4095-419a-9b09-afa04283a69f","Type":"ContainerDied","Data":"c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1"} Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.142470 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.142998 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-log" containerID="cri-o://438245a707c3bbe792477e7783157c77ef4990e9609e3fae09f2db6284e24d03" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.143446 5047 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-httpd" containerID="cri-o://95c93cfd98d085f0620a320a6bc1bd384f6afe679ed96aea9f76c9b0475d04ab" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.243016 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.279961 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6c0a-account-create-update-mngkq"] Feb 23 09:02:41 crc kubenswrapper[5047]: E0223 09:02:41.308474 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 09:02:41 crc kubenswrapper[5047]: E0223 09:02:41.308535 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts podName:e444ce7d-b56a-406c-91ff-4623a469c13a nodeName:}" failed. No retries permitted until 2026-02-23 09:02:42.308520242 +0000 UTC m=+8284.559847376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts") pod "nova-cell1-b6cf-account-create-update-r57lp" (UID: "e444ce7d-b56a-406c-91ff-4623a469c13a") : configmap "openstack-cell1-scripts" not found Feb 23 09:02:41 crc kubenswrapper[5047]: E0223 09:02:41.308534 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:41 crc kubenswrapper[5047]: E0223 09:02:41.308605 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data podName:db558a41-6dbf-4b18-af50-6a5311530ef4 nodeName:}" failed. 
No retries permitted until 2026-02-23 09:02:43.308585194 +0000 UTC m=+8285.559912328 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data") pod "rabbitmq-cell1-server-0" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4") : configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.355740 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6ff9694944-nmq4c"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.356519 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6ff9694944-nmq4c" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-log" containerID="cri-o://20ea81041bb52ceb23d8bb279a41e64793f561c21214179c617e0cd48b7db8c3" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.356645 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6ff9694944-nmq4c" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-api" containerID="cri-o://7ec9f905acecad1325d415f2ddb8f8369e83f15dfec01b649e5e7392716e0b34" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.401985 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b60b-account-create-update-rwx74"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.427442 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86dd65c656-6n5cp"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.427672 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86dd65c656-6n5cp" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon-log" containerID="cri-o://902673b5c0f9f5c778122d89416d764f9c657be00747f6d0aaf5e3bce9ebde6d" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: 
I0223 09:02:41.431632 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerName="rabbitmq" containerID="cri-o://1b91d13309e4eceb1e8a3d37f030db4d6384384b7d04e3f74e24403de0da2f57" gracePeriod=604800 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.431619 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86dd65c656-6n5cp" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon" containerID="cri-o://47ff5d6db1f45daec44c8d4bbc86c28325f137beacfc6286b66cd85eba48a741" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.457034 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5aa7-account-create-update-khr6g"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.484575 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.486084 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-api" containerID="cri-o://0aa17ecda203fdcd836c6d88a867e21da21bf1e6661eebac5135520ec1cd39ff" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.484807 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-log" containerID="cri-o://a5b2ffecbcd348e06e8ff7bb5a39ad7d2084d7e9a705dfca2cffe091ad053ce8" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.520465 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.520801 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" 
podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-api" containerID="cri-o://e5e7ee2c8b135051740b1ea7d291484779a9c2e5a708219176467993426ff5fd" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.521236 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-notifier" containerID="cri-o://a0fef05f6987a47a8d1202e9246540bf050838d7c193ea1ef5e1bb51346912c0" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.521295 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-listener" containerID="cri-o://e932f691f1d7bc2d70f16a64edd63fd7ca5272ea9a3dce01dda8b2cbb647641f" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.521537 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-evaluator" containerID="cri-o://7a05e953fc80368078f66f82f035d35c3158450022c92ac887c442e42eb47b06" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.575385 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-fd8nx"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.575434 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-fd8nx"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.600515 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6db8c9f77f-vfbhs"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.600744 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" 
containerID="cri-o://c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" gracePeriod=60 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.619277 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.619553 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-log" containerID="cri-o://c349d4019c79a38336f264dd317d2df1febd4f029273706fb5fc837b0d162f2a" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.620051 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-metadata" containerID="cri-o://d75d9c5c9f819a4fef6f8458366281a7a7b56194dad2149a7dac9c6288a64b3e" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.652097 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58486f655d-lth95"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.652372 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-58486f655d-lth95" podUID="8596ec60-89ed-43d9-be63-b7130fd0f937" containerName="heat-cfnapi" containerID="cri-o://38bbd56ee9a5e5626e98c5f9e39c04af222fb0e3d5740284ea5093fdcc638927" gracePeriod=60 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.695364 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5448bb6c56-5br5b"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.695719 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5448bb6c56-5br5b" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-httpd" containerID="cri-o://8a9512e77ec9a5c41445f0fc54c6a74435b9b78c0ecdcff32ba1f9acce025e1f" 
gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.696270 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5448bb6c56-5br5b" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-server" containerID="cri-o://cff924505599b5b5ba1b26edde59684e4ca7be4355f211ee64421ecef23a8028" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.723735 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-xrnmv"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.764959 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.842731 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f58d8574d-t4gn8"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.847586 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5f58d8574d-t4gn8" podUID="50aef737-c888-466c-92d0-9c683267d266" containerName="heat-api" containerID="cri-o://d228a3b0cadcbb3c297b956ba0f819355627a6ecf7412f2e7ba427ce670fe91f" gracePeriod=60 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.885371 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.885567 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 09:02:41 crc kubenswrapper[5047]: E0223 09:02:41.886218 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:41 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:41 crc kubenswrapper[5047]: Feb 23 09:02:41 crc kubenswrapper[5047]: 
MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:41 crc kubenswrapper[5047]: Feb 23 09:02:41 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:41 crc kubenswrapper[5047]: Feb 23 09:02:41 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:41 crc kubenswrapper[5047]: Feb 23 09:02:41 crc kubenswrapper[5047]: if [ -n "barbican" ]; then Feb 23 09:02:41 crc kubenswrapper[5047]: GRANT_DATABASE="barbican" Feb 23 09:02:41 crc kubenswrapper[5047]: else Feb 23 09:02:41 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:41 crc kubenswrapper[5047]: fi Feb 23 09:02:41 crc kubenswrapper[5047]: Feb 23 09:02:41 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:41 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:41 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:41 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:41 crc kubenswrapper[5047]: # support updates Feb 23 09:02:41 crc kubenswrapper[5047]: Feb 23 09:02:41 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:41 crc kubenswrapper[5047]: E0223 09:02:41.887498 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-b60b-account-create-update-rwx74" podUID="ac795337-f981-4559-a2d7-9e3eb81f9e33" Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.940310 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-n2bdk"] Feb 23 09:02:41 crc kubenswrapper[5047]: W0223 09:02:41.966926 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe43cf5f_4813_4df4_ba76_345d096d5816.slice/crio-314c66a7c913e80ed0dc2d967cfa8947beac3864ebbf48fa33541c4150b9e4e2 WatchSource:0}: Error finding container 314c66a7c913e80ed0dc2d967cfa8947beac3864ebbf48fa33541c4150b9e4e2: Status 404 returned error can't find the container with id 314c66a7c913e80ed0dc2d967cfa8947beac3864ebbf48fa33541c4150b9e4e2 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.970025 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5f75d6c566-lwrqj"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.970877 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerName="barbican-keystone-listener" containerID="cri-o://46026da011a0aa7880379234ae4494704c649f1e05648955f118d375e2a3e1e0" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.970551 5047 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerName="barbican-keystone-listener-log" containerID="cri-o://f80a58364511a0fa76558ab45536d04c4cce2e97d79499c3e5dad8ce40b0217a" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.975627 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerName="rabbitmq" containerID="cri-o://c118d283468399c97557ee04375c5d709e9aa2386c25e5bb23ba33476c1b630a" gracePeriod=604800 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.987265 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-56c47db55f-krsw7"] Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.987576 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-56c47db55f-krsw7" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker-log" containerID="cri-o://2b967aa0fbb7606bb8ade55e0690f140e4568286a58a2b7dd614a5b813dee08a" gracePeriod=30 Feb 23 09:02:41 crc kubenswrapper[5047]: I0223 09:02:41.988049 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-56c47db55f-krsw7" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker" containerID="cri-o://4c4abeb50762236bdb25af75e59326ffcbc0cafbe34c51b67459b228cce0deb5" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:41.996818 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-r57lp"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.014452 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d854f95bd-qf2l8"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.014686 5047 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d854f95bd-qf2l8" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api-log" containerID="cri-o://dcf3076fcf8e8a3596430ddf627b1510267721dfc147bcaa96b9951e0ece0e7c" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.015340 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d854f95bd-qf2l8" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api" containerID="cri-o://d60474bd97e9b0d0915a58eae3db7da57f5234e0b02651a966146b0399694399" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.036685 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.038765 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.038963 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="054e8a69-ea75-47c8-bd54-b4475341100f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://06eae257f344b94484be127cf7a2511935e34c00c735dcc879f6b8f3cea4311b" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.060369 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-lmjpr"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.040377 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:42 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:42 crc kubenswrapper[5047]: Feb 23 09:02:42 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source 
/var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:42 crc kubenswrapper[5047]: Feb 23 09:02:42 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:42 crc kubenswrapper[5047]: Feb 23 09:02:42 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:42 crc kubenswrapper[5047]: Feb 23 09:02:42 crc kubenswrapper[5047]: if [ -n "neutron" ]; then Feb 23 09:02:42 crc kubenswrapper[5047]: GRANT_DATABASE="neutron" Feb 23 09:02:42 crc kubenswrapper[5047]: else Feb 23 09:02:42 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:42 crc kubenswrapper[5047]: fi Feb 23 09:02:42 crc kubenswrapper[5047]: Feb 23 09:02:42 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:42 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:42 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:42 crc kubenswrapper[5047]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:42 crc kubenswrapper[5047]: # support updates Feb 23 09:02:42 crc kubenswrapper[5047]: Feb 23 09:02:42 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.062586 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-317c-account-create-update-ppk28" podUID="fe43cf5f-4813-4df4-ba76-345d096d5816" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.108653 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-lmjpr"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.122171 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.126917 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-dcb1-account-create-update-s6wxg"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.134937 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.181514 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.182386 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="prometheus" containerID="cri-o://4f7304ad6ed4398aa70c150989b050ad9909a66441b44da2cc965ae4528229d0" gracePeriod=600 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.183154 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="thanos-sidecar" containerID="cri-o://f8aec307320ad2b429f05d28aac10036e7f2f6cdf40d47a064a28a6e546628b0" gracePeriod=600 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.183225 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="config-reloader" containerID="cri-o://e6a7f689edafc92eda919136c37b6c2443ad2862e793b4c249f8c7326840442a" gracePeriod=600 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.193592 5047 generic.go:334] "Generic (PLEG): container finished" podID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerID="c349d4019c79a38336f264dd317d2df1febd4f029273706fb5fc837b0d162f2a" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.193693 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4051daa4-7ea3-4ab2-ad1f-52353e0ad995","Type":"ContainerDied","Data":"c349d4019c79a38336f264dd317d2df1febd4f029273706fb5fc837b0d162f2a"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.199450 5047 generic.go:334] "Generic (PLEG): container finished" podID="f1306003-e12b-4db1-beeb-cd461db0975e" containerID="9ea42abec4f78326158bc5cb0cec2641d24efe082290fbd2778b5995dbeb07c2" exitCode=0 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.199566 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755d9b4d6f-jbxjv" event={"ID":"f1306003-e12b-4db1-beeb-cd461db0975e","Type":"ContainerDied","Data":"9ea42abec4f78326158bc5cb0cec2641d24efe082290fbd2778b5995dbeb07c2"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.209228 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.210480 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/alertmanager-metric-storage-0" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="alertmanager" containerID="cri-o://7061c930f5e73ec3ef1ef35e6ca4955b8a79a7a14eda68ced588ceb181b02467" gracePeriod=120 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.211058 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/alertmanager-metric-storage-0" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="config-reloader" containerID="cri-o://8c1124f684fe17c61a20660889d0932a297693e7c21894c823edb60c364c744b" gracePeriod=120 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.211420 5047 generic.go:334] "Generic (PLEG): container finished" podID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerID="f80a58364511a0fa76558ab45536d04c4cce2e97d79499c3e5dad8ce40b0217a" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 
09:02:42.211474 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" event={"ID":"56f69678-8caa-45a4-8361-a0bf3ef10d19","Type":"ContainerDied","Data":"f80a58364511a0fa76558ab45536d04c4cce2e97d79499c3e5dad8ce40b0217a"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.226514 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b60b-account-create-update-rwx74"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.241798 5047 generic.go:334] "Generic (PLEG): container finished" podID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerID="3cd1af755bfcf4970c210c91d87839f5bae5b4edc4ad2683851e88ccef337c33" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.241880 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d570b43-d0ff-42d7-a305-9d7c2f9f9881","Type":"ContainerDied","Data":"3cd1af755bfcf4970c210c91d87839f5bae5b4edc4ad2683851e88ccef337c33"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.254612 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-317c-account-create-update-ppk28"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.260211 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.267790 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="74700fa7-59df-4201-a7c4-de815b82208e" containerName="galera" containerID="cri-o://000d3c24fd99b6c2ed581b5f5c4fd5b32ee63c9bba247e2ec7f68a77055306d1" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.269986 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rl8wd"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277678 5047 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-sb\") pod \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277755 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjxld\" (UniqueName: \"kubernetes.io/projected/5eaba05e-ecdc-472a-b77f-a4fd716b467e-kube-api-access-vjxld\") pod \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277787 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxts9\" (UniqueName: \"kubernetes.io/projected/51062b9d-51b2-4e47-b577-3cdc144cf0d1-kube-api-access-sxts9\") pod \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277820 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-dns-svc\") pod \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277853 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-combined-ca-bundle\") pod \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277895 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-config\") pod \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\" (UID: 
\"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277950 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-nb\") pod \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\" (UID: \"51062b9d-51b2-4e47-b577-3cdc144cf0d1\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.277992 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config-secret\") pod \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.278046 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config\") pod \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\" (UID: \"5eaba05e-ecdc-472a-b77f-a4fd716b467e\") " Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.280594 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9205930b-2303-4b01-a3bf-cf4ef3ad0a49/ovsdbserver-nb/0.log" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.292205 5047 generic.go:334] "Generic (PLEG): container finished" podID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerID="809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.292316 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9205930b-2303-4b01-a3bf-cf4ef3ad0a49","Type":"ContainerDied","Data":"809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.317143 5047 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51062b9d-51b2-4e47-b577-3cdc144cf0d1-kube-api-access-sxts9" (OuterVolumeSpecName: "kube-api-access-sxts9") pod "51062b9d-51b2-4e47-b577-3cdc144cf0d1" (UID: "51062b9d-51b2-4e47-b577-3cdc144cf0d1"). InnerVolumeSpecName "kube-api-access-sxts9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.317991 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eaba05e-ecdc-472a-b77f-a4fd716b467e-kube-api-access-vjxld" (OuterVolumeSpecName: "kube-api-access-vjxld") pod "5eaba05e-ecdc-472a-b77f-a4fd716b467e" (UID: "5eaba05e-ecdc-472a-b77f-a4fd716b467e"). InnerVolumeSpecName "kube-api-access-vjxld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.338456 5047 generic.go:334] "Generic (PLEG): container finished" podID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerID="071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c" exitCode=0 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.338568 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" event={"ID":"51062b9d-51b2-4e47-b577-3cdc144cf0d1","Type":"ContainerDied","Data":"071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.338602 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" event={"ID":"51062b9d-51b2-4e47-b577-3cdc144cf0d1","Type":"ContainerDied","Data":"cd82de8cc6bdaf684ad109dbf7950ce05eda902f0be6f3625e0bed3dacdb29d5"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.338624 5047 scope.go:117] "RemoveContainer" containerID="071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.338776 5047 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-749799ffdc-q8kjx" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.346865 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.347160 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.356021 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5eaba05e-ecdc-472a-b77f-a4fd716b467e" (UID: "5eaba05e-ecdc-472a-b77f-a4fd716b467e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.389490 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.389520 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjxld\" (UniqueName: \"kubernetes.io/projected/5eaba05e-ecdc-472a-b77f-a4fd716b467e-kube-api-access-vjxld\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.389529 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxts9\" (UniqueName: \"kubernetes.io/projected/51062b9d-51b2-4e47-b577-3cdc144cf0d1-kube-api-access-sxts9\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.389590 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.389634 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data podName:d8f82aad-7df9-4b14-a328-2cc708aeed84 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:46.389619893 +0000 UTC m=+8288.640947027 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data") pod "rabbitmq-server-0" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84") : configmap "rabbitmq-config-data" not found Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.389959 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.389990 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts podName:e444ce7d-b56a-406c-91ff-4623a469c13a nodeName:}" failed. No retries permitted until 2026-02-23 09:02:44.389982683 +0000 UTC m=+8286.641309817 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts") pod "nova-cell1-b6cf-account-create-update-r57lp" (UID: "e444ce7d-b56a-406c-91ff-4623a469c13a") : configmap "openstack-cell1-scripts" not found Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.412091 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-config" (OuterVolumeSpecName: "config") pod "51062b9d-51b2-4e47-b577-3cdc144cf0d1" (UID: "51062b9d-51b2-4e47-b577-3cdc144cf0d1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.412653 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031265ff-da09-4cb9-b2e8-f16da9486723" path="/var/lib/kubelet/pods/031265ff-da09-4cb9-b2e8-f16da9486723/volumes" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.413402 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a32b6fb-abc2-42ed-94ba-0682376fdc51" path="/var/lib/kubelet/pods/0a32b6fb-abc2-42ed-94ba-0682376fdc51/volumes" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.427530 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a" path="/var/lib/kubelet/pods/0b3b6b52-7c75-4e2a-bc9f-753b3b74c55a/volumes" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.428617 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd40f96-4019-455e-b292-19cd00b8e616" path="/var/lib/kubelet/pods/8dd40f96-4019-455e-b292-19cd00b8e616/volumes" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.429183 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48ca0ec-13fa-4342-aafb-b1ef2c09e08b" path="/var/lib/kubelet/pods/e48ca0ec-13fa-4342-aafb-b1ef2c09e08b/volumes" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.447334 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b60b-account-create-update-rwx74" event={"ID":"ac795337-f981-4559-a2d7-9e3eb81f9e33","Type":"ContainerStarted","Data":"88760f5ea52760f2facc3375b5f7aa431a055725be2142fd81de3579c2257db7"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.447375 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.447579 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="3464c846-13b9-479e-b9af-3d571f03b284" containerName="nova-scheduler-scheduler" containerID="cri-o://86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.466381 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51062b9d-51b2-4e47-b577-3cdc144cf0d1" (UID: "51062b9d-51b2-4e47-b577-3cdc144cf0d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.472216 5047 generic.go:334] "Generic (PLEG): container finished" podID="5e613490-e007-4f7e-9868-abf59633c7c2" containerID="2b967aa0fbb7606bb8ade55e0690f140e4568286a58a2b7dd614a5b813dee08a" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.472294 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56c47db55f-krsw7" event={"ID":"5e613490-e007-4f7e-9868-abf59633c7c2","Type":"ContainerDied","Data":"2b967aa0fbb7606bb8ade55e0690f140e4568286a58a2b7dd614a5b813dee08a"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.489055 5047 generic.go:334] "Generic (PLEG): container finished" podID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerID="a5b2ffecbcd348e06e8ff7bb5a39ad7d2084d7e9a705dfca2cffe091ad053ce8" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.489126 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"570bec3d-603c-4f92-b183-c5abb7e799d8","Type":"ContainerDied","Data":"a5b2ffecbcd348e06e8ff7bb5a39ad7d2084d7e9a705dfca2cffe091ad053ce8"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.493894 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-config\") on node \"crc\" DevicePath \"\"" Feb 23 
09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.493931 5047 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.499040 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-317c-account-create-update-ppk28" event={"ID":"fe43cf5f-4813-4df4-ba76-345d096d5816","Type":"ContainerStarted","Data":"314c66a7c913e80ed0dc2d967cfa8947beac3864ebbf48fa33541c4150b9e4e2"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.509046 5047 generic.go:334] "Generic (PLEG): container finished" podID="95116420-b62b-402c-bcbe-17026cba0354" containerID="438245a707c3bbe792477e7783157c77ef4990e9609e3fae09f2db6284e24d03" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.509098 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95116420-b62b-402c-bcbe-17026cba0354","Type":"ContainerDied","Data":"438245a707c3bbe792477e7783157c77ef4990e9609e3fae09f2db6284e24d03"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.562218 5047 scope.go:117] "RemoveContainer" containerID="f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.562433 5047 generic.go:334] "Generic (PLEG): container finished" podID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" containerID="69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76" exitCode=137 Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.562517 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:02:42 crc 
kubenswrapper[5047]: I0223 09:02:42.562625 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.568130 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.573772 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.573858 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.580790 5047 generic.go:334] "Generic (PLEG): container finished" podID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerID="ec75f8350065896340138084424783a8ac231166ea7fffbf795299442b505721" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.580880 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ce03d60-441e-4a79-acce-d54444aedfcb","Type":"ContainerDied","Data":"ec75f8350065896340138084424783a8ac231166ea7fffbf795299442b505721"} Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.583603 5047 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8 is running failed: container process not found" containerID="809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.583762 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5eaba05e-ecdc-472a-b77f-a4fd716b467e" (UID: "5eaba05e-ecdc-472a-b77f-a4fd716b467e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.584114 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8 is running failed: container process not found" containerID="809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.584463 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8 is running failed: container process not found" containerID="809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.584493 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8 is running failed: container process not found" 
probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="ovsdbserver-nb" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.584624 5047 generic.go:334] "Generic (PLEG): container finished" podID="07094621-ca88-4942-a226-76667658a5bc" containerID="20ea81041bb52ceb23d8bb279a41e64793f561c21214179c617e0cd48b7db8c3" exitCode=143 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.584667 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ff9694944-nmq4c" event={"ID":"07094621-ca88-4942-a226-76667658a5bc","Type":"ContainerDied","Data":"20ea81041bb52ceb23d8bb279a41e64793f561c21214179c617e0cd48b7db8c3"} Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.593389 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5eaba05e-ecdc-472a-b77f-a4fd716b467e" (UID: "5eaba05e-ecdc-472a-b77f-a4fd716b467e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.595811 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.595859 5047 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5eaba05e-ecdc-472a-b77f-a4fd716b467e-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.654159 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "51062b9d-51b2-4e47-b577-3cdc144cf0d1" (UID: "51062b9d-51b2-4e47-b577-3cdc144cf0d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.698657 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.726757 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.727058 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="01713a47-8bed-4038-8339-bdcd77e6e1db" containerName="nova-cell1-conductor-conductor" containerID="cri-o://9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.746076 5047 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-cell1-novncproxy-0" podUID="054e8a69-ea75-47c8-bd54-b4475341100f" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.101:6080/vnc_lite.html\": dial tcp 10.217.1.101:6080: connect: connection refused" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.751715 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "51062b9d-51b2-4e47-b577-3cdc144cf0d1" (UID: "51062b9d-51b2-4e47-b577-3cdc144cf0d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.765683 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.765885 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" gracePeriod=30 Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.801046 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51062b9d-51b2-4e47-b577-3cdc144cf0d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.836160 5047 scope.go:117] "RemoveContainer" containerID="071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.841174 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c\": container with ID starting with 
071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c not found: ID does not exist" containerID="071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.841224 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c"} err="failed to get container status \"071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c\": rpc error: code = NotFound desc = could not find container \"071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c\": container with ID starting with 071c4bbb4d57ad6d8007e0dcce8f552bbb5d21daa763338dd2cc6338d661611c not found: ID does not exist" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.841244 5047 scope.go:117] "RemoveContainer" containerID="f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.844048 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e\": container with ID starting with f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e not found: ID does not exist" containerID="f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.844080 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e"} err="failed to get container status \"f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e\": rpc error: code = NotFound desc = could not find container \"f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e\": container with ID starting with f5e5fd8547d7964dd7a8489415f62d3412f9ced9de30736cdacd9fe566d2cd9e not found: ID does not 
exist" Feb 23 09:02:42 crc kubenswrapper[5047]: I0223 09:02:42.844096 5047 scope.go:117] "RemoveContainer" containerID="69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76" Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.950913 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.966416 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.972028 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:42 crc kubenswrapper[5047]: E0223 09:02:42.972090 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" containerName="nova-cell0-conductor-conductor" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.028103 5047 scope.go:117] "RemoveContainer" containerID="69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76" Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.028643 
5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76\": container with ID starting with 69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76 not found: ID does not exist" containerID="69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.028666 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76"} err="failed to get container status \"69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76\": rpc error: code = NotFound desc = could not find container \"69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76\": container with ID starting with 69d3101d0dd1f058d3e374fea4657a2a86dab508f6826496a6c180ad1d829e76 not found: ID does not exist" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.061204 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749799ffdc-q8kjx"] Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.071821 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-749799ffdc-q8kjx"] Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.316005 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.317768 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data podName:db558a41-6dbf-4b18-af50-6a5311530ef4 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:47.316266362 +0000 UTC m=+8289.567593496 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data") pod "rabbitmq-cell1-server-0" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4") : configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.351060 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.386114 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.419094 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.419176 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="01713a47-8bed-4038-8339-bdcd77e6e1db" containerName="nova-cell1-conductor-conductor" Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.522277 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc 
= container is not created or running: checking if PID of 3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4 is running failed: container process not found" containerID="3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.526075 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4 is running failed: container process not found" containerID="3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.532957 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4 is running failed: container process not found" containerID="3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 23 09:02:43 crc kubenswrapper[5047]: E0223 09:02:43.532999 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="ovsdbserver-sb" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.562422 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-755d9b4d6f-jbxjv" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.55:9696/\": dial tcp 10.217.1.55:9696: connect: connection refused" Feb 23 09:02:43 crc 
kubenswrapper[5047]: I0223 09:02:43.619129 5047 generic.go:334] "Generic (PLEG): container finished" podID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerID="0ff894197ec53c6fa19bcceb0a1998cc1302ed5266ed87a7fab99b18cfd8df6e" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.619185 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c3821a-cbde-41d5-95fd-1e617b2d12bc","Type":"ContainerDied","Data":"0ff894197ec53c6fa19bcceb0a1998cc1302ed5266ed87a7fab99b18cfd8df6e"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.622630 5047 generic.go:334] "Generic (PLEG): container finished" podID="054e8a69-ea75-47c8-bd54-b4475341100f" containerID="06eae257f344b94484be127cf7a2511935e34c00c735dcc879f6b8f3cea4311b" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.622686 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"054e8a69-ea75-47c8-bd54-b4475341100f","Type":"ContainerDied","Data":"06eae257f344b94484be127cf7a2511935e34c00c735dcc879f6b8f3cea4311b"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.636062 5047 generic.go:334] "Generic (PLEG): container finished" podID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerID="8c1124f684fe17c61a20660889d0932a297693e7c21894c823edb60c364c744b" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.636093 5047 generic.go:334] "Generic (PLEG): container finished" podID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerID="7061c930f5e73ec3ef1ef35e6ca4955b8a79a7a14eda68ced588ceb181b02467" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.636128 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerDied","Data":"8c1124f684fe17c61a20660889d0932a297693e7c21894c823edb60c364c744b"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.636149 5047 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerDied","Data":"7061c930f5e73ec3ef1ef35e6ca4955b8a79a7a14eda68ced588ceb181b02467"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.644890 5047 generic.go:334] "Generic (PLEG): container finished" podID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerID="f8aec307320ad2b429f05d28aac10036e7f2f6cdf40d47a064a28a6e546628b0" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.644927 5047 generic.go:334] "Generic (PLEG): container finished" podID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerID="e6a7f689edafc92eda919136c37b6c2443ad2862e793b4c249f8c7326840442a" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.644935 5047 generic.go:334] "Generic (PLEG): container finished" podID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerID="4f7304ad6ed4398aa70c150989b050ad9909a66441b44da2cc965ae4528229d0" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.644976 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerDied","Data":"f8aec307320ad2b429f05d28aac10036e7f2f6cdf40d47a064a28a6e546628b0"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.644994 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerDied","Data":"e6a7f689edafc92eda919136c37b6c2443ad2862e793b4c249f8c7326840442a"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.645002 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerDied","Data":"4f7304ad6ed4398aa70c150989b050ad9909a66441b44da2cc965ae4528229d0"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 
09:02:43.656158 5047 generic.go:334] "Generic (PLEG): container finished" podID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerID="dcf3076fcf8e8a3596430ddf627b1510267721dfc147bcaa96b9951e0ece0e7c" exitCode=143 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.656220 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d854f95bd-qf2l8" event={"ID":"1058e169-d572-43c5-80b3-d5f2f2c78afb","Type":"ContainerDied","Data":"dcf3076fcf8e8a3596430ddf627b1510267721dfc147bcaa96b9951e0ece0e7c"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.662779 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_346c3f0d-e5fb-40f6-bd1e-65679466165f/ovsdbserver-sb/0.log" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.662857 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"346c3f0d-e5fb-40f6-bd1e-65679466165f","Type":"ContainerDied","Data":"8e0a04b317d902506666f366cdbba232698e745e2cfd26007172c793701ab106"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.662889 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e0a04b317d902506666f366cdbba232698e745e2cfd26007172c793701ab106" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.666780 5047 generic.go:334] "Generic (PLEG): container finished" podID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerID="7a05e953fc80368078f66f82f035d35c3158450022c92ac887c442e42eb47b06" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.666801 5047 generic.go:334] "Generic (PLEG): container finished" podID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerID="e5e7ee2c8b135051740b1ea7d291484779a9c2e5a708219176467993426ff5fd" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.666828 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerDied","Data":"7a05e953fc80368078f66f82f035d35c3158450022c92ac887c442e42eb47b06"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.666844 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerDied","Data":"e5e7ee2c8b135051740b1ea7d291484779a9c2e5a708219176467993426ff5fd"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.683405 5047 generic.go:334] "Generic (PLEG): container finished" podID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerID="cff924505599b5b5ba1b26edde59684e4ca7be4355f211ee64421ecef23a8028" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.683435 5047 generic.go:334] "Generic (PLEG): container finished" podID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerID="8a9512e77ec9a5c41445f0fc54c6a74435b9b78c0ecdcff32ba1f9acce025e1f" exitCode=0 Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.683474 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448bb6c56-5br5b" event={"ID":"0c50549a-ecc3-4c2f-a625-4cfa3492fbad","Type":"ContainerDied","Data":"cff924505599b5b5ba1b26edde59684e4ca7be4355f211ee64421ecef23a8028"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.683499 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448bb6c56-5br5b" event={"ID":"0c50549a-ecc3-4c2f-a625-4cfa3492fbad","Type":"ContainerDied","Data":"8a9512e77ec9a5c41445f0fc54c6a74435b9b78c0ecdcff32ba1f9acce025e1f"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.695627 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rl8wd" event={"ID":"b4565dbe-dc04-452f-a79e-bc09cb299f29","Type":"ContainerStarted","Data":"758a2b9824548a6ea893ec6c329b255e312a6dbcad06f364ad1875b79ec04052"} Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.789723 5047 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovsdbserver-sb-0_346c3f0d-e5fb-40f6-bd1e-65679466165f/ovsdbserver-sb/0.log" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.789783 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.829050 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.840176 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.841076 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-combined-ca-bundle\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.841177 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdbserver-sb-tls-certs\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.841239 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-config\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.841748 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.841792 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-scripts\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.841811 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdb-rundir\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.841997 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9n4c\" (UniqueName: \"kubernetes.io/projected/346c3f0d-e5fb-40f6-bd1e-65679466165f-kube-api-access-l9n4c\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.842069 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-metrics-certs-tls-certs\") pod \"346c3f0d-e5fb-40f6-bd1e-65679466165f\" (UID: \"346c3f0d-e5fb-40f6-bd1e-65679466165f\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.843503 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.844727 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-config" (OuterVolumeSpecName: "config") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.845179 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-scripts" (OuterVolumeSpecName: "scripts") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.864144 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346c3f0d-e5fb-40f6-bd1e-65679466165f-kube-api-access-l9n4c" (OuterVolumeSpecName: "kube-api-access-l9n4c") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "kube-api-access-l9n4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.882084 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5448bb6c56-5br5b" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.1.59:8080/healthcheck\": dial tcp 10.217.1.59:8080: connect: connection refused" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.884462 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5448bb6c56-5br5b" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.59:8080/healthcheck\": dial tcp 10.217.1.59:8080: connect: connection refused" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.894080 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.910155 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.944960 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nzgh\" (UniqueName: \"kubernetes.io/projected/fe43cf5f-4813-4df4-ba76-345d096d5816-kube-api-access-2nzgh\") pod \"fe43cf5f-4813-4df4-ba76-345d096d5816\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945192 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe43cf5f-4813-4df4-ba76-345d096d5816-operator-scripts\") pod \"fe43cf5f-4813-4df4-ba76-345d096d5816\" (UID: \"fe43cf5f-4813-4df4-ba76-345d096d5816\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945218 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac795337-f981-4559-a2d7-9e3eb81f9e33-operator-scripts\") pod \"ac795337-f981-4559-a2d7-9e3eb81f9e33\" (UID: \"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945355 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9vlg\" (UniqueName: \"kubernetes.io/projected/ac795337-f981-4559-a2d7-9e3eb81f9e33-kube-api-access-g9vlg\") pod \"ac795337-f981-4559-a2d7-9e3eb81f9e33\" (UID: \"ac795337-f981-4559-a2d7-9e3eb81f9e33\") " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945814 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945827 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdb-rundir\") on node 
\"crc\" DevicePath \"\"" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945837 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9n4c\" (UniqueName: \"kubernetes.io/projected/346c3f0d-e5fb-40f6-bd1e-65679466165f-kube-api-access-l9n4c\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945845 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945853 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c3f0d-e5fb-40f6-bd1e-65679466165f-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.945873 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\") on node \"crc\" " Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.948108 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac795337-f981-4559-a2d7-9e3eb81f9e33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac795337-f981-4559-a2d7-9e3eb81f9e33" (UID: "ac795337-f981-4559-a2d7-9e3eb81f9e33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.948327 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe43cf5f-4813-4df4-ba76-345d096d5816-kube-api-access-2nzgh" (OuterVolumeSpecName: "kube-api-access-2nzgh") pod "fe43cf5f-4813-4df4-ba76-345d096d5816" (UID: "fe43cf5f-4813-4df4-ba76-345d096d5816"). 
InnerVolumeSpecName "kube-api-access-2nzgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.948327 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe43cf5f-4813-4df4-ba76-345d096d5816-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe43cf5f-4813-4df4-ba76-345d096d5816" (UID: "fe43cf5f-4813-4df4-ba76-345d096d5816"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.951215 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac795337-f981-4559-a2d7-9e3eb81f9e33-kube-api-access-g9vlg" (OuterVolumeSpecName: "kube-api-access-g9vlg") pod "ac795337-f981-4559-a2d7-9e3eb81f9e33" (UID: "ac795337-f981-4559-a2d7-9e3eb81f9e33"). InnerVolumeSpecName "kube-api-access-g9vlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.992880 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:02:43 crc kubenswrapper[5047]: I0223 09:02:43.993097 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3") on node "crc" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.007318 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.040748 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "346c3f0d-e5fb-40f6-bd1e-65679466165f" (UID: "346c3f0d-e5fb-40f6-bd1e-65679466165f"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.048503 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nzgh\" (UniqueName: \"kubernetes.io/projected/fe43cf5f-4813-4df4-ba76-345d096d5816-kube-api-access-2nzgh\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.048535 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.048546 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe43cf5f-4813-4df4-ba76-345d096d5816-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.048555 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac795337-f981-4559-a2d7-9e3eb81f9e33-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.048564 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cad832b9-27d5-4c65-a09f-8f4b1b3b48b3\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 
09:02:44.048575 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9vlg\" (UniqueName: \"kubernetes.io/projected/ac795337-f981-4559-a2d7-9e3eb81f9e33-kube-api-access-g9vlg\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.048583 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/346c3f0d-e5fb-40f6-bd1e-65679466165f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.126322 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.146106 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.197386 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.69:8776/healthcheck\": read tcp 10.217.0.2:43812->10.217.1.69:8776: read: connection reset by peer" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.210461 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254279 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-tls-assets\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254328 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-secret-combined-ca-bundle\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254353 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config-out\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254442 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-alertmanager-metric-storage-db\") pod \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254469 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-volume\") pod \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254487 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-1\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254518 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-0\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254544 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbnck\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-kube-api-access-dbnck\") pod \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254631 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-out\") pod \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254656 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-cluster-tls-config\") pod \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254680 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254720 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254755 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-tls-assets\") pod \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\" (UID: \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254788 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254829 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254866 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-web-config\") pod \"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\" (UID: 
\"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.254896 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-thanos-prometheus-http-client-file\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.255112 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.255150 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-2\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.255184 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5pdn\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-kube-api-access-t5pdn\") pod \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\" (UID: \"9ac0b4e9-66ad-4719-b036-d9e833ad7a37\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.257669 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.258050 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.263377 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" (UID: "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.265932 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-kube-api-access-dbnck" (OuterVolumeSpecName: "kube-api-access-dbnck") pod "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" (UID: "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6"). InnerVolumeSpecName "kube-api-access-dbnck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.268130 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-kube-api-access-t5pdn" (OuterVolumeSpecName: "kube-api-access-t5pdn") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "kube-api-access-t5pdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.268496 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config-out" (OuterVolumeSpecName: "config-out") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.270332 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-alertmanager-metric-storage-db" (OuterVolumeSpecName: "alertmanager-metric-storage-db") pod "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" (UID: "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6"). InnerVolumeSpecName "alertmanager-metric-storage-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.271259 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.271826 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). 
InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.274269 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.276782 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.280295 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.292252 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.294882 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config" (OuterVolumeSpecName: "config") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.297882 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-out" (OuterVolumeSpecName: "config-out") pod "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" (UID: "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.298244 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" (UID: "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.311839 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" (UID: "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.317647 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.358120 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-vencrypt-tls-certs\") pod \"054e8a69-ea75-47c8-bd54-b4475341100f\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.358210 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-config-data\") pod \"054e8a69-ea75-47c8-bd54-b4475341100f\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.358296 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-nova-novncproxy-tls-certs\") pod 
\"054e8a69-ea75-47c8-bd54-b4475341100f\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.358445 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-combined-ca-bundle\") pod \"054e8a69-ea75-47c8-bd54-b4475341100f\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.358533 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj9xl\" (UniqueName: \"kubernetes.io/projected/054e8a69-ea75-47c8-bd54-b4475341100f-kube-api-access-tj9xl\") pod \"054e8a69-ea75-47c8-bd54-b4475341100f\" (UID: \"054e8a69-ea75-47c8-bd54-b4475341100f\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359122 5047 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-out\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359137 5047 reconciler_common.go:293] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-cluster-tls-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359146 5047 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359200 5047 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359212 5047 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359237 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359246 5047 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359276 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") on node \"crc\" " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359288 5047 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359298 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5pdn\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-kube-api-access-t5pdn\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359307 5047 reconciler_common.go:293] "Volume detached 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359315 5047 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359323 5047 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-config-out\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359366 5047 reconciler_common.go:293] "Volume detached for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-alertmanager-metric-storage-db\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359376 5047 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359401 5047 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359461 5047 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.359472 5047 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-dbnck\" (UniqueName: \"kubernetes.io/projected/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-kube-api-access-dbnck\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.368898 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5448bb6c56-5br5b" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.371982 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054e8a69-ea75-47c8-bd54-b4475341100f-kube-api-access-tj9xl" (OuterVolumeSpecName: "kube-api-access-tj9xl") pod "054e8a69-ea75-47c8-bd54-b4475341100f" (UID: "054e8a69-ea75-47c8-bd54-b4475341100f"). InnerVolumeSpecName "kube-api-access-tj9xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.372595 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config" (OuterVolumeSpecName: "web-config") pod "9ac0b4e9-66ad-4719-b036-d9e833ad7a37" (UID: "9ac0b4e9-66ad-4719-b036-d9e833ad7a37"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.413091 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.413395 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5") on node "crc" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.423006 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" path="/var/lib/kubelet/pods/51062b9d-51b2-4e47-b577-3cdc144cf0d1/volumes" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.423705 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eaba05e-ecdc-472a-b77f-a4fd716b467e" path="/var/lib/kubelet/pods/5eaba05e-ecdc-472a-b77f-a4fd716b467e/volumes" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.434545 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "054e8a69-ea75-47c8-bd54-b4475341100f" (UID: "054e8a69-ea75-47c8-bd54-b4475341100f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.455340 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-config-data" (OuterVolumeSpecName: "config-data") pod "054e8a69-ea75-47c8-bd54-b4475341100f" (UID: "054e8a69-ea75-47c8-bd54-b4475341100f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.460048 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.460847 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-combined-ca-bundle\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.460934 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-log-httpd\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461025 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzhn\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-kube-api-access-gzzhn\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461110 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-run-httpd\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461157 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-internal-tls-certs\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461174 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-public-tls-certs\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461205 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-config-data\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461257 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-etc-swift\") pod \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\" (UID: \"0c50549a-ecc3-4c2f-a625-4cfa3492fbad\") " Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461663 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: "0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461831 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461848 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461861 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461873 5047 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ac0b4e9-66ad-4719-b036-d9e833ad7a37-web-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461884 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj9xl\" (UniqueName: \"kubernetes.io/projected/054e8a69-ea75-47c8-bd54-b4475341100f-kube-api-access-tj9xl\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.461896 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed3272f7-28c4-49a9-8b3a-4faf611bb7f5\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.464672 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: 
"0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.470017 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.477205 5047 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.477269 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts podName:e444ce7d-b56a-406c-91ff-4623a469c13a nodeName:}" failed. No retries permitted until 2026-02-23 09:02:48.477249975 +0000 UTC m=+8290.728577109 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts") pod "nova-cell1-b6cf-account-create-update-r57lp" (UID: "e444ce7d-b56a-406c-91ff-4623a469c13a") : configmap "openstack-cell1-scripts" not found Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.478443 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: "0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.503319 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.503373 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3464c846-13b9-479e-b9af-3d571f03b284" containerName="nova-scheduler-scheduler" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.534304 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-kube-api-access-gzzhn" (OuterVolumeSpecName: "kube-api-access-gzzhn") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: "0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "kube-api-access-gzzhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.535441 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "054e8a69-ea75-47c8-bd54-b4475341100f" (UID: "054e8a69-ea75-47c8-bd54-b4475341100f"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.599835 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.599871 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzhn\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-kube-api-access-gzzhn\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.599886 5047 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.599919 5047 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.602510 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-xrnmv"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.617618 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5aa7-account-create-update-khr6g"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.656282 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6c0a-account-create-update-mngkq"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.671473 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "054e8a69-ea75-47c8-bd54-b4475341100f" (UID: 
"054e8a69-ea75-47c8-bd54-b4475341100f"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.673005 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0953-account-create-update-psp84"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.671035 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-web-config" (OuterVolumeSpecName: "web-config") pod "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" (UID: "ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.701233 5047 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6-web-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.701260 5047 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/054e8a69-ea75-47c8-bd54-b4475341100f-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.702205 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-dcb1-account-create-update-s6wxg"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.707408 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.114:9292/healthcheck\": read tcp 10.217.0.2:53624->10.217.1.114:9292: read: connection reset by peer" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.707711 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" 
podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.114:9292/healthcheck\": read tcp 10.217.0.2:53628->10.217.1.114:9292: read: connection reset by peer" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.716038 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:44 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: if [ -n "nova_api" ]; then Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="nova_api" Feb 23 09:02:44 crc kubenswrapper[5047]: else Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:44 crc kubenswrapper[5047]: fi Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:44 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:44 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:44 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:44 crc kubenswrapper[5047]: # support updates Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.718880 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-0ef9-account-create-update-xrnmv" podUID="a0b8f96e-aaed-43c4-9906-8920be9f478b" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.726412 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"054e8a69-ea75-47c8-bd54-b4475341100f","Type":"ContainerDied","Data":"b1f821f0c0aa971eb2f41a23bef8446760fe40b1f778e18fd5ac0eb656946e38"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.726465 5047 scope.go:117] "RemoveContainer" containerID="06eae257f344b94484be127cf7a2511935e34c00c735dcc879f6b8f3cea4311b" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.727168 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.748003 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:44 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: if [ -n "placement" ]; then Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="placement" Feb 23 09:02:44 crc kubenswrapper[5047]: else Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:44 crc kubenswrapper[5047]: fi Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:44 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:44 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:44 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:44 crc kubenswrapper[5047]: # support updates Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.748093 5047 generic.go:334] "Generic (PLEG): container finished" podID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerID="8c44a4d060fe0afbdab6a1f05ac20d1249c428efb3b01ca24ed8e720e9d10271" exitCode=0 Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.748182 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ce03d60-441e-4a79-acce-d54444aedfcb","Type":"ContainerDied","Data":"8c44a4d060fe0afbdab6a1f05ac20d1249c428efb3b01ca24ed8e720e9d10271"} Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.749857 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-5aa7-account-create-update-khr6g" podUID="f287ea57-5fcf-42d6-a886-fb5f35962785" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.763098 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-n2bdk"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.764583 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: "0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.768849 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448bb6c56-5br5b" event={"ID":"0c50549a-ecc3-4c2f-a625-4cfa3492fbad","Type":"ContainerDied","Data":"8add52e375d3e75b9aee7e2b15793bfcdbaf51c6a30afc98e9cd3ea8dcb9b935"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.769313 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5448bb6c56-5br5b" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.775328 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:44 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: if [ -n "cinder" ]; then Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="cinder" Feb 23 09:02:44 crc kubenswrapper[5047]: else Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:44 crc kubenswrapper[5047]: fi Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:44 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:44 crc kubenswrapper[5047]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:44 crc kubenswrapper[5047]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:44 crc kubenswrapper[5047]: # support updates Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.775509 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:44 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: if [ -n "glance" ]; then Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="glance" Feb 23 09:02:44 crc kubenswrapper[5047]: else Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:44 crc kubenswrapper[5047]: fi Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:44 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:44 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:44 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:44 crc kubenswrapper[5047]: # support updates Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.777330 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-6c0a-account-create-update-mngkq" podUID="4ad1ab7e-d139-471c-9d7c-ceaea0bc0271" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.777561 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-0953-account-create-update-psp84" podUID="7607a87e-f072-4cdf-ba60-40b579d694ab" Feb 23 09:02:44 crc kubenswrapper[5047]: W0223 09:02:44.792782 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d4528b_cbbd_4c90_97e6_73924844d615.slice/crio-d933e50d0b2c355d9304b88dfa2ef6915b28e41d72fb265398d0ebceb147a3fb WatchSource:0}: Error finding container d933e50d0b2c355d9304b88dfa2ef6915b28e41d72fb265398d0ebceb147a3fb: Status 404 returned error can't find the container with id d933e50d0b2c355d9304b88dfa2ef6915b28e41d72fb265398d0ebceb147a3fb Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.800437 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.800881 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:44 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: if [ -n "nova_cell1" ]; then Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="nova_cell1" Feb 23 09:02:44 crc kubenswrapper[5047]: else Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:44 crc kubenswrapper[5047]: fi Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:44 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:44 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:44 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:44 crc kubenswrapper[5047]: # support updates Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.801316 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9ac0b4e9-66ad-4719-b036-d9e833ad7a37","Type":"ContainerDied","Data":"ea290947b23bacf10558492b4748209193c0c5af0f3e71634623034bd1b1875d"} Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.802257 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" podUID="e444ce7d-b56a-406c-91ff-4623a469c13a" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.803500 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.805978 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-config-data" (OuterVolumeSpecName: "config-data") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: "0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.810803 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:44 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: if [ -n "aodh" ]; then Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="aodh" Feb 23 09:02:44 crc kubenswrapper[5047]: else Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:44 crc kubenswrapper[5047]: fi Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:44 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:44 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:44 crc kubenswrapper[5047]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:44 crc kubenswrapper[5047]: # support updates Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.813662 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: "0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.814623 5047 generic.go:334] "Generic (PLEG): container finished" podID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerID="6a1e38661d1c5ddbb45bad7c88e6e2bbc8848d6927477296120c0dddf2445603" exitCode=0 Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.814684 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d570b43-d0ff-42d7-a305-9d7c2f9f9881","Type":"ContainerDied","Data":"6a1e38661d1c5ddbb45bad7c88e6e2bbc8848d6927477296120c0dddf2445603"} Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.814786 5047 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 09:02:44 crc kubenswrapper[5047]: container &Container{Name:mariadb-account-create-update,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[/bin/sh -c #!/bin/bash Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc 
kubenswrapper[5047]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: if [ -n "nova_cell0" ]; then Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="nova_cell0" Feb 23 09:02:44 crc kubenswrapper[5047]: else Feb 23 09:02:44 crc kubenswrapper[5047]: GRANT_DATABASE="*" Feb 23 09:02:44 crc kubenswrapper[5047]: fi Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: # going for maximum compatibility here: Feb 23 09:02:44 crc kubenswrapper[5047]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 09:02:44 crc kubenswrapper[5047]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 09:02:44 crc kubenswrapper[5047]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 09:02:44 crc kubenswrapper[5047]: # support updates Feb 23 09:02:44 crc kubenswrapper[5047]: Feb 23 09:02:44 crc kubenswrapper[5047]: $MYSQL_CMD < logger="UnhandledError" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.815799 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"aodh-db-secret\\\" not found\"" pod="openstack/aodh-dcb1-account-create-update-s6wxg" podUID="26d4528b-cbbd-4c90-97e6-73924844d615" Feb 23 09:02:44 crc kubenswrapper[5047]: E0223 09:02:44.818389 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" podUID="dd7a46ed-9033-4c72-9f92-34816276560b" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.821284 5047 generic.go:334] "Generic (PLEG): container finished" podID="74700fa7-59df-4201-a7c4-de815b82208e" 
containerID="000d3c24fd99b6c2ed581b5f5c4fd5b32ee63c9bba247e2ec7f68a77055306d1" exitCode=0 Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.821419 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"74700fa7-59df-4201-a7c4-de815b82208e","Type":"ContainerDied","Data":"000d3c24fd99b6c2ed581b5f5c4fd5b32ee63c9bba247e2ec7f68a77055306d1"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.824811 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0ef9-account-create-update-xrnmv" event={"ID":"a0b8f96e-aaed-43c4-9906-8920be9f478b","Type":"ContainerStarted","Data":"1a2b26d7323362b47bf174cdc8fecc7164224daffae79c29438876b966d26f72"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.879229 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b60b-account-create-update-rwx74" event={"ID":"ac795337-f981-4559-a2d7-9e3eb81f9e33","Type":"ContainerDied","Data":"88760f5ea52760f2facc3375b5f7aa431a055725be2142fd81de3579c2257db7"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.879501 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b60b-account-create-update-rwx74" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.884228 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9205930b-2303-4b01-a3bf-cf4ef3ad0a49/ovsdbserver-nb/0.log" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.884298 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9205930b-2303-4b01-a3bf-cf4ef3ad0a49","Type":"ContainerDied","Data":"b5e9c347395e22b45b3d92ad8de4b552ed83b5cfa51c306ad4ea5f6100207545"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.884321 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e9c347395e22b45b3d92ad8de4b552ed83b5cfa51c306ad4ea5f6100207545" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.888271 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-r57lp"] Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.889716 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-317c-account-create-update-ppk28" event={"ID":"fe43cf5f-4813-4df4-ba76-345d096d5816","Type":"ContainerDied","Data":"314c66a7c913e80ed0dc2d967cfa8947beac3864ebbf48fa33541c4150b9e4e2"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.889790 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-317c-account-create-update-ppk28" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.897331 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5aa7-account-create-update-khr6g" event={"ID":"f287ea57-5fcf-42d6-a886-fb5f35962785","Type":"ContainerStarted","Data":"02ec53b96752e8e5029bb80b6b3c9fe601e7db569a181e58159dfb34f109eb6f"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.905867 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.905924 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.939748 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6","Type":"ContainerDied","Data":"fd9e969f29217f252ede0efae00289691d2ded6c6fe9ba4cdecd9365036422ea"} Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.939930 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.946059 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0c50549a-ecc3-4c2f-a625-4cfa3492fbad" (UID: "0c50549a-ecc3-4c2f-a625-4cfa3492fbad"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.953033 5047 generic.go:334] "Generic (PLEG): container finished" podID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerID="e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0" exitCode=1 Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.953169 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.954686 5047 scope.go:117] "RemoveContainer" containerID="e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0" Feb 23 09:02:44 crc kubenswrapper[5047]: I0223 09:02:44.955441 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rl8wd" event={"ID":"b4565dbe-dc04-452f-a79e-bc09cb299f29","Type":"ContainerDied","Data":"e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0"} Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.007634 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c50549a-ecc3-4c2f-a625-4cfa3492fbad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.095484 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86dd65c656-6n5cp" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.118:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:49750->10.217.1.118:8443: read: connection reset by peer" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.114778 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.106:8775/\": read tcp 
10.217.0.2:40978->10.217.1.106:8775: read: connection reset by peer" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.114769 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.106:8775/\": read tcp 10.217.0.2:40976->10.217.1.106:8775: read: connection reset by peer" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.335137 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.335457 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-central-agent" containerID="cri-o://a65243912ed9e00054f4ebb306fb5fe73793f4ea7a467cd196686a7efbdd8df4" gracePeriod=30 Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.335950 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="proxy-httpd" containerID="cri-o://a0215ac3fbe253cdcd33e23224e66c3483f6042b430140b93acbd73cad3aa678" gracePeriod=30 Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.336012 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="sg-core" containerID="cri-o://60141a0931e5bb65b57978e1903e918449f3377efc61f92213ef2a271cd7c68c" gracePeriod=30 Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.336061 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-notification-agent" containerID="cri-o://5fe5d75b2b3b7484337c60c1fa98ffc49a3af8f8d35cefe2ba43edd510fddd68" gracePeriod=30 
Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.385819 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.386043 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2d714b4f-02d1-433b-ba95-54c199957dce" containerName="kube-state-metrics" containerID="cri-o://f23fd67e4cf7fa54402cf3da162be46abfc25906ea3cdcb5916c37e37fe8051a" gracePeriod=30 Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.462149 5047 scope.go:117] "RemoveContainer" containerID="cff924505599b5b5ba1b26edde59684e4ca7be4355f211ee64421ecef23a8028" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.462398 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58486f655d-lth95" podUID="8596ec60-89ed-43d9-be63-b7130fd0f937" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.129:8000/healthcheck\": read tcp 10.217.0.2:48144->10.217.1.129:8000: read: connection reset by peer" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.513999 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5f58d8574d-t4gn8" podUID="50aef737-c888-466c-92d0-9c683267d266" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.128:8004/healthcheck\": read tcp 10.217.0.2:51124->10.217.1.128:8004: read: connection reset by peer" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.584865 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d854f95bd-qf2l8" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.48:9311/healthcheck\": read tcp 10.217.0.2:47794->10.217.1.48:9311: read: connection reset by peer" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.585289 5047 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-5d854f95bd-qf2l8" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.48:9311/healthcheck\": read tcp 10.217.0.2:47802->10.217.1.48:9311: read: connection reset by peer" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.646047 5047 scope.go:117] "RemoveContainer" containerID="8a9512e77ec9a5c41445f0fc54c6a74435b9b78c0ecdcff32ba1f9acce025e1f" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.780115 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.780352 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="39cf7673-4d38-49e0-9b86-f80c3949fd06" containerName="memcached" containerID="cri-o://2fb193ba990e8911587f36aa7890ab58c29c03d0ee9c8cdec4e9fb37cf8c3f1b" gracePeriod=30 Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.791543 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ce6c-account-create-update-cm686"] Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793019 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-server" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793035 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-server" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793055 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="ovsdbserver-sb" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793061 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="ovsdbserver-sb" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793074 5047 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="prometheus" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793081 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="prometheus" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793091 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerName="dnsmasq-dns" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793596 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerName="dnsmasq-dns" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793610 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054e8a69-ea75-47c8-bd54-b4475341100f" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793627 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="054e8a69-ea75-47c8-bd54-b4475341100f" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793645 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="openstack-network-exporter" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793653 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="openstack-network-exporter" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793661 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-httpd" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793667 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-httpd" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793678 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="init-config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793684 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="init-config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793693 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="alertmanager" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793699 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="alertmanager" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793708 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793713 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793724 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerName="init" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793731 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerName="init" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793746 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793752 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793764 5047 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="thanos-sidecar" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793769 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="thanos-sidecar" Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.793779 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="init-config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793785 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="init-config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793985 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="054e8a69-ea75-47c8-bd54-b4475341100f" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.793998 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-httpd" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794005 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="ovsdbserver-sb" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794017 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="thanos-sidecar" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794072 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794084 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" containerName="proxy-server" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 
09:02:45.794095 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="alertmanager" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794105 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" containerName="prometheus" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794116 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" containerName="config-reloader" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794125 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" containerName="openstack-network-exporter" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.794132 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="51062b9d-51b2-4e47-b577-3cdc144cf0d1" containerName="dnsmasq-dns" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.797529 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.806174 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.833839 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ce6c-account-create-update-cm686"] Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.864357 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.11:5671: connect: connection refused" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.913725 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.12:5671: connect: connection refused" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.961588 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rh4b\" (UniqueName: \"kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.961714 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.991968 
5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-75f6dd64bf-sswjk"] Feb 23 09:02:45 crc kubenswrapper[5047]: I0223 09:02:45.992490 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-75f6dd64bf-sswjk" podUID="55697218-de1b-424f-b5ff-2d0806e54a96" containerName="keystone-api" containerID="cri-o://a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285" gracePeriod=30 Feb 23 09:02:45 crc kubenswrapper[5047]: E0223 09:02:45.993045 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.010977 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.018881 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.018962 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="c59427cb-019c-4f83-af18-75900909e70f" 
containerName="ovn-northd" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.043297 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-cron-29530621-csdh2"] Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.054163 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9205930b-2303-4b01-a3bf-cf4ef3ad0a49/ovsdbserver-nb/0.log" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.054251 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.064679 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdbserver-nb-tls-certs\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066199 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066268 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-scripts\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066290 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-combined-ca-bundle\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: 
\"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066306 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-config\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066333 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-metrics-certs-tls-certs\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066400 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxpj9\" (UniqueName: \"kubernetes.io/projected/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-kube-api-access-kxpj9\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066423 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdb-rundir\") pod \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\" (UID: \"9205930b-2303-4b01-a3bf-cf4ef3ad0a49\") " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066659 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rh4b\" (UniqueName: \"kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.066723 5047 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.066826 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.066869 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:46.566855393 +0000 UTC m=+8288.818182527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : configmap "openstack-scripts" not found Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.077951 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-scripts" (OuterVolumeSpecName: "scripts") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.079161 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.079786 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-config" (OuterVolumeSpecName: "config") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.083071 5047 scope.go:117] "RemoveContainer" containerID="f8aec307320ad2b429f05d28aac10036e7f2f6cdf40d47a064a28a6e546628b0" Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.084271 5047 projected.go:194] Error preparing data for projected volume kube-api-access-2rh4b for pod openstack/keystone-ce6c-account-create-update-cm686: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.084343 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:46.584320713 +0000 UTC m=+8288.835647847 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2rh4b" (UniqueName: "kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.092978 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-kube-api-access-kxpj9" (OuterVolumeSpecName: "kube-api-access-kxpj9") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "kube-api-access-kxpj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.097920 5047 generic.go:334] "Generic (PLEG): container finished" podID="8596ec60-89ed-43d9-be63-b7130fd0f937" containerID="38bbd56ee9a5e5626e98c5f9e39c04af222fb0e3d5740284ea5093fdcc638927" exitCode=0 Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.098018 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-cron-29530621-csdh2"] Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.098056 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58486f655d-lth95" event={"ID":"8596ec60-89ed-43d9-be63-b7130fd0f937","Type":"ContainerDied","Data":"38bbd56ee9a5e5626e98c5f9e39c04af222fb0e3d5740284ea5093fdcc638927"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.099980 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.103690 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ce6c-account-create-update-cm686"] Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.106058 5047 generic.go:334] "Generic (PLEG): container finished" podID="95116420-b62b-402c-bcbe-17026cba0354" containerID="95c93cfd98d085f0620a320a6bc1bd384f6afe679ed96aea9f76c9b0475d04ab" exitCode=0 Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.106129 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95116420-b62b-402c-bcbe-17026cba0354","Type":"ContainerDied","Data":"95c93cfd98d085f0620a320a6bc1bd384f6afe679ed96aea9f76c9b0475d04ab"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.108700 5047 generic.go:334] "Generic (PLEG): container finished" podID="07094621-ca88-4942-a226-76667658a5bc" containerID="7ec9f905acecad1325d415f2ddb8f8369e83f15dfec01b649e5e7392716e0b34" exitCode=0 Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.108741 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ff9694944-nmq4c" event={"ID":"07094621-ca88-4942-a226-76667658a5bc","Type":"ContainerDied","Data":"7ec9f905acecad1325d415f2ddb8f8369e83f15dfec01b649e5e7392716e0b34"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.108757 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ff9694944-nmq4c" event={"ID":"07094621-ca88-4942-a226-76667658a5bc","Type":"ContainerDied","Data":"40ddd88a6a1cc45ff5d983c62212c82d17cddd1b683063a0a534936a73694212"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.108766 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40ddd88a6a1cc45ff5d983c62212c82d17cddd1b683063a0a534936a73694212" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.109538 5047 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0ef9-account-create-update-xrnmv" event={"ID":"a0b8f96e-aaed-43c4-9906-8920be9f478b","Type":"ContainerDied","Data":"1a2b26d7323362b47bf174cdc8fecc7164224daffae79c29438876b966d26f72"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.109563 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a2b26d7323362b47bf174cdc8fecc7164224daffae79c29438876b966d26f72" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.110308 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" event={"ID":"e444ce7d-b56a-406c-91ff-4623a469c13a","Type":"ContainerStarted","Data":"481f36f5de5af782dc8c93afd0fa15a20caca6864cba4c5a5b85e8bbeb86442d"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.112410 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5aa7-account-create-update-khr6g" event={"ID":"f287ea57-5fcf-42d6-a886-fb5f35962785","Type":"ContainerDied","Data":"02ec53b96752e8e5029bb80b6b3c9fe601e7db569a181e58159dfb34f109eb6f"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.112432 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ec53b96752e8e5029bb80b6b3c9fe601e7db569a181e58159dfb34f109eb6f" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.113729 5047 generic.go:334] "Generic (PLEG): container finished" podID="50aef737-c888-466c-92d0-9c683267d266" containerID="d228a3b0cadcbb3c297b956ba0f819355627a6ecf7412f2e7ba427ce670fe91f" exitCode=0 Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.113766 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f58d8574d-t4gn8" event={"ID":"50aef737-c888-466c-92d0-9c683267d266","Type":"ContainerDied","Data":"d228a3b0cadcbb3c297b956ba0f819355627a6ecf7412f2e7ba427ce670fe91f"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.116301 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/openstack-galera-0"] Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.123300 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dcb1-account-create-update-s6wxg" event={"ID":"26d4528b-cbbd-4c90-97e6-73924844d615","Type":"ContainerStarted","Data":"d933e50d0b2c355d9304b88dfa2ef6915b28e41d72fb265398d0ebceb147a3fb"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.128260 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rl8wd"] Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.133014 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.133430 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"74700fa7-59df-4201-a7c4-de815b82208e","Type":"ContainerDied","Data":"4bd063b40bce2917ee513c5473c64da14c8fa9673242bb9cb0ad59ab6f069bdf"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.133455 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd063b40bce2917ee513c5473c64da14c8fa9673242bb9cb0ad59ab6f069bdf" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.134791 5047 generic.go:334] "Generic (PLEG): container finished" podID="2d714b4f-02d1-433b-ba95-54c199957dce" containerID="f23fd67e4cf7fa54402cf3da162be46abfc25906ea3cdcb5916c37e37fe8051a" exitCode=2 Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.134824 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"2d714b4f-02d1-433b-ba95-54c199957dce","Type":"ContainerDied","Data":"f23fd67e4cf7fa54402cf3da162be46abfc25906ea3cdcb5916c37e37fe8051a"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.141793 5047 generic.go:334] "Generic (PLEG): container finished" podID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerID="a0215ac3fbe253cdcd33e23224e66c3483f6042b430140b93acbd73cad3aa678" exitCode=0 Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.141825 5047 generic.go:334] "Generic (PLEG): container finished" podID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerID="60141a0931e5bb65b57978e1903e918449f3377efc61f92213ef2a271cd7c68c" exitCode=2 Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.142969 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerDied","Data":"a0215ac3fbe253cdcd33e23224e66c3483f6042b430140b93acbd73cad3aa678"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.143007 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerDied","Data":"60141a0931e5bb65b57978e1903e918449f3377efc61f92213ef2a271cd7c68c"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.153721 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" event={"ID":"dd7a46ed-9033-4c72-9f92-34816276560b","Type":"ContainerStarted","Data":"38bcacf8966c3b7e4b625c708afec5920738c423aeb323ca9d190fd247101942"} Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.159283 5047 scope.go:117] "RemoveContainer" containerID="e6a7f689edafc92eda919136c37b6c2443ad2862e793b4c249f8c7326840442a" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.171003 5047 generic.go:334] "Generic (PLEG): container finished" podID="adff4079-41bc-4cde-bc30-4a29f5302568" 
containerID="47ff5d6db1f45daec44c8d4bbc86c28325f137beacfc6286b66cd85eba48a741" exitCode=0
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.171096 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dd65c656-6n5cp" event={"ID":"adff4079-41bc-4cde-bc30-4a29f5302568","Type":"ContainerDied","Data":"47ff5d6db1f45daec44c8d4bbc86c28325f137beacfc6286b66cd85eba48a741"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.171922 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxpj9\" (UniqueName: \"kubernetes.io/projected/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-kube-api-access-kxpj9\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.171945 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.171967 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\") on node \"crc\" "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.171978 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.171988 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-config\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.175392 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.178853 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6c0a-account-create-update-mngkq" event={"ID":"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271","Type":"ContainerStarted","Data":"86d1fdb997c8a7b9c4ec3111182e27f9ead55cc4be32d2b1ccfdf79eba4ad122"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.180508 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ff9694944-nmq4c"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.208296 5047 generic.go:334] "Generic (PLEG): container finished" podID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerID="d60474bd97e9b0d0915a58eae3db7da57f5234e0b02651a966146b0399694399" exitCode=0
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.208387 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d854f95bd-qf2l8" event={"ID":"1058e169-d572-43c5-80b3-d5f2f2c78afb","Type":"ContainerDied","Data":"d60474bd97e9b0d0915a58eae3db7da57f5234e0b02651a966146b0399694399"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.210111 5047 scope.go:117] "RemoveContainer" containerID="4f7304ad6ed4398aa70c150989b050ad9909a66441b44da2cc965ae4528229d0"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.214671 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5448bb6c56-5br5b"]
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.217214 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5aa7-account-create-update-khr6g"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.217324 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0953-account-create-update-psp84" event={"ID":"7607a87e-f072-4cdf-ba60-40b579d694ab","Type":"ContainerStarted","Data":"d0c3640f49ef24287e8b18c6b16b35c5a762a34a00b9946debda1cf73e838699"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.219697 5047 generic.go:334] "Generic (PLEG): container finished" podID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerID="0aa17ecda203fdcd836c6d88a867e21da21bf1e6661eebac5135520ec1cd39ff" exitCode=0
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.219839 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"570bec3d-603c-4f92-b183-c5abb7e799d8","Type":"ContainerDied","Data":"0aa17ecda203fdcd836c6d88a867e21da21bf1e6661eebac5135520ec1cd39ff"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.227930 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5448bb6c56-5br5b"]
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.238034 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ce03d60-441e-4a79-acce-d54444aedfcb","Type":"ContainerDied","Data":"74a9ad11394e36f5cc68ae559052ccef981d45f632432c7f9fd20dee8c669234"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.238073 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74a9ad11394e36f5cc68ae559052ccef981d45f632432c7f9fd20dee8c669234"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.239286 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.242277 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-xrnmv"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.243364 5047 generic.go:334] "Generic (PLEG): container finished" podID="3464c846-13b9-479e-b9af-3d571f03b284" containerID="86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3" exitCode=0
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.243415 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3464c846-13b9-479e-b9af-3d571f03b284","Type":"ContainerDied","Data":"86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.246497 5047 generic.go:334] "Generic (PLEG): container finished" podID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerID="d75d9c5c9f819a4fef6f8458366281a7a7b56194dad2149a7dac9c6288a64b3e" exitCode=0
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.246554 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4051daa4-7ea3-4ab2-ad1f-52353e0ad995","Type":"ContainerDied","Data":"d75d9c5c9f819a4fef6f8458366281a7a7b56194dad2149a7dac9c6288a64b3e"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.261130 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.270753 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.271677 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d570b43-d0ff-42d7-a305-9d7c2f9f9881","Type":"ContainerDied","Data":"521f1cfb6ff27fc630ac83fac3af6c49d9aff0ef6a40c0df0f03027347f320db"}
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.271829 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.275813 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-etc-machine-id\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.275952 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg9zb\" (UniqueName: \"kubernetes.io/projected/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-kube-api-access-fg9zb\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.276045 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.276096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-logs\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.276155 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data-custom\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.276609 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-scripts\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.276729 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-public-tls-certs\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.276812 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-internal-tls-certs\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.276851 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-combined-ca-bundle\") pod \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\" (UID: \"3d570b43-d0ff-42d7-a305-9d7c2f9f9881\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.278023 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.285558 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.292081 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-logs" (OuterVolumeSpecName: "logs") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.299605 5047 scope.go:117] "RemoveContainer" containerID="b6c3d6a20c48b88b3ebe6dbb452453fb26811d2587e79ad60243317acb4b1af9"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.299839 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.310424 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2rh4b operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-ce6c-account-create-update-cm686" podUID="73633a8a-1272-44da-a229-00dd09693a39"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.330667 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.377766 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054e8a69-ea75-47c8-bd54-b4475341100f" path="/var/lib/kubelet/pods/054e8a69-ea75-47c8-bd54-b4475341100f/volumes"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.378564 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c50549a-ecc3-4c2f-a625-4cfa3492fbad" path="/var/lib/kubelet/pods/0c50549a-ecc3-4c2f-a625-4cfa3492fbad/volumes"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.379534 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854c6fd4-0b9e-4479-b732-d9ae5f72ea4f" path="/var/lib/kubelet/pods/854c6fd4-0b9e-4479-b732-d9ae5f72ea4f/volumes"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.380552 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-kube-api-access-fg9zb" (OuterVolumeSpecName: "kube-api-access-fg9zb") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "kube-api-access-fg9zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387324 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-operator-scripts\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387400 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-httpd-run\") pod \"4ce03d60-441e-4a79-acce-d54444aedfcb\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387458 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-combined-ca-bundle\") pod \"4ce03d60-441e-4a79-acce-d54444aedfcb\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387493 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-public-tls-certs\") pod \"07094621-ca88-4942-a226-76667658a5bc\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387536 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-config-data-default\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387585 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-combined-ca-bundle\") pod \"07094621-ca88-4942-a226-76667658a5bc\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387615 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-scripts\") pod \"07094621-ca88-4942-a226-76667658a5bc\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387637 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-logs\") pod \"4ce03d60-441e-4a79-acce-d54444aedfcb\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387688 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6gr4\" (UniqueName: \"kubernetes.io/projected/07094621-ca88-4942-a226-76667658a5bc-kube-api-access-m6gr4\") pod \"07094621-ca88-4942-a226-76667658a5bc\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387723 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-internal-tls-certs\") pod \"4ce03d60-441e-4a79-acce-d54444aedfcb\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387792 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-kolla-config\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387923 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4ce03d60-441e-4a79-acce-d54444aedfcb" (UID: "4ce03d60-441e-4a79-acce-d54444aedfcb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.388274 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.391620 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.395419 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-logs" (OuterVolumeSpecName: "logs") pod "4ce03d60-441e-4a79-acce-d54444aedfcb" (UID: "4ce03d60-441e-4a79-acce-d54444aedfcb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.387847 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-scripts\") pod \"4ce03d60-441e-4a79-acce-d54444aedfcb\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.395745 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-logs\") pod \"95116420-b62b-402c-bcbe-17026cba0354\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.395797 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-config-data\") pod \"07094621-ca88-4942-a226-76667658a5bc\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.395862 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc9cs\" (UniqueName: \"kubernetes.io/projected/f287ea57-5fcf-42d6-a886-fb5f35962785-kube-api-access-dc9cs\") pod \"f287ea57-5fcf-42d6-a886-fb5f35962785\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.395928 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07094621-ca88-4942-a226-76667658a5bc-logs\") pod \"07094621-ca88-4942-a226-76667658a5bc\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.395951 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.396751 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07094621-ca88-4942-a226-76667658a5bc-logs" (OuterVolumeSpecName: "logs") pod "07094621-ca88-4942-a226-76667658a5bc" (UID: "07094621-ca88-4942-a226-76667658a5bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.397114 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-logs" (OuterVolumeSpecName: "logs") pod "95116420-b62b-402c-bcbe-17026cba0354" (UID: "95116420-b62b-402c-bcbe-17026cba0354"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401592 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401679 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-galera-tls-certs\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401722 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74700fa7-59df-4201-a7c4-de815b82208e-config-data-generated\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401798 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-scripts\") pod \"95116420-b62b-402c-bcbe-17026cba0354\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401831 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f287ea57-5fcf-42d6-a886-fb5f35962785-operator-scripts\") pod \"f287ea57-5fcf-42d6-a886-fb5f35962785\" (UID: \"f287ea57-5fcf-42d6-a886-fb5f35962785\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401868 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b8f96e-aaed-43c4-9906-8920be9f478b-operator-scripts\") pod \"a0b8f96e-aaed-43c4-9906-8920be9f478b\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401930 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-internal-tls-certs\") pod \"07094621-ca88-4942-a226-76667658a5bc\" (UID: \"07094621-ca88-4942-a226-76667658a5bc\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.401969 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vldcx\" (UniqueName: \"kubernetes.io/projected/74700fa7-59df-4201-a7c4-de815b82208e-kube-api-access-vldcx\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.402036 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9j7g\" (UniqueName: \"kubernetes.io/projected/a0b8f96e-aaed-43c4-9906-8920be9f478b-kube-api-access-h9j7g\") pod \"a0b8f96e-aaed-43c4-9906-8920be9f478b\" (UID: \"a0b8f96e-aaed-43c4-9906-8920be9f478b\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.402061 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxwvf\" (UniqueName: \"kubernetes.io/projected/4ce03d60-441e-4a79-acce-d54444aedfcb-kube-api-access-bxwvf\") pod \"4ce03d60-441e-4a79-acce-d54444aedfcb\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.402116 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-config-data\") pod \"4ce03d60-441e-4a79-acce-d54444aedfcb\" (UID: \"4ce03d60-441e-4a79-acce-d54444aedfcb\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.402151 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-combined-ca-bundle\") pod \"74700fa7-59df-4201-a7c4-de815b82208e\" (UID: \"74700fa7-59df-4201-a7c4-de815b82208e\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403568 5047 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-kolla-config\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403592 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-logs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403604 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07094621-ca88-4942-a226-76667658a5bc-logs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403622 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg9zb\" (UniqueName: \"kubernetes.io/projected/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-kube-api-access-fg9zb\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403636 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-logs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403649 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403662 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403676 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74700fa7-59df-4201-a7c4-de815b82208e-config-data-default\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.403688 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce03d60-441e-4a79-acce-d54444aedfcb-logs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.403768 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.403830 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data podName:d8f82aad-7df9-4b14-a328-2cc708aeed84 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:54.403811279 +0000 UTC m=+8296.655138413 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data") pod "rabbitmq-server-0" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84") : configmap "rabbitmq-config-data" not found
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.421681 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-scripts" (OuterVolumeSpecName: "scripts") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.422935 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74700fa7-59df-4201-a7c4-de815b82208e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.424351 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f287ea57-5fcf-42d6-a886-fb5f35962785-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f287ea57-5fcf-42d6-a886-fb5f35962785" (UID: "f287ea57-5fcf-42d6-a886-fb5f35962785"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.424977 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0b8f96e-aaed-43c4-9906-8920be9f478b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0b8f96e-aaed-43c4-9906-8920be9f478b" (UID: "a0b8f96e-aaed-43c4-9906-8920be9f478b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.427660 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.428129 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-scripts" (OuterVolumeSpecName: "scripts") pod "07094621-ca88-4942-a226-76667658a5bc" (UID: "07094621-ca88-4942-a226-76667658a5bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.444386 5047 scope.go:117] "RemoveContainer" containerID="8c1124f684fe17c61a20660889d0932a297693e7c21894c823edb60c364c744b"
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.458276 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce03d60-441e-4a79-acce-d54444aedfcb-kube-api-access-bxwvf" (OuterVolumeSpecName: "kube-api-access-bxwvf") pod "4ce03d60-441e-4a79-acce-d54444aedfcb" (UID: "4ce03d60-441e-4a79-acce-d54444aedfcb"). InnerVolumeSpecName "kube-api-access-bxwvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.467033 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.467811 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07094621-ca88-4942-a226-76667658a5bc-kube-api-access-m6gr4" (OuterVolumeSpecName: "kube-api-access-m6gr4") pod "07094621-ca88-4942-a226-76667658a5bc" (UID: "07094621-ca88-4942-a226-76667658a5bc"). InnerVolumeSpecName "kube-api-access-m6gr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.478419 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b8f96e-aaed-43c4-9906-8920be9f478b-kube-api-access-h9j7g" (OuterVolumeSpecName: "kube-api-access-h9j7g") pod "a0b8f96e-aaed-43c4-9906-8920be9f478b" (UID: "a0b8f96e-aaed-43c4-9906-8920be9f478b"). InnerVolumeSpecName "kube-api-access-h9j7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.501826 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-scripts" (OuterVolumeSpecName: "scripts") pod "4ce03d60-441e-4a79-acce-d54444aedfcb" (UID: "4ce03d60-441e-4a79-acce-d54444aedfcb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.503560 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f287ea57-5fcf-42d6-a886-fb5f35962785-kube-api-access-dc9cs" (OuterVolumeSpecName: "kube-api-access-dc9cs") pod "f287ea57-5fcf-42d6-a886-fb5f35962785" (UID: "f287ea57-5fcf-42d6-a886-fb5f35962785"). InnerVolumeSpecName "kube-api-access-dc9cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.503680 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74700fa7-59df-4201-a7c4-de815b82208e-kube-api-access-vldcx" (OuterVolumeSpecName: "kube-api-access-vldcx") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "kube-api-access-vldcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504560 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq6kc\" (UniqueName: \"kubernetes.io/projected/570bec3d-603c-4f92-b183-c5abb7e799d8-kube-api-access-dq6kc\") pod \"570bec3d-603c-4f92-b183-c5abb7e799d8\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504599 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570bec3d-603c-4f92-b183-c5abb7e799d8-logs\") pod \"570bec3d-603c-4f92-b183-c5abb7e799d8\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504641 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-combined-ca-bundle\") pod \"570bec3d-603c-4f92-b183-c5abb7e799d8\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504689 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-public-tls-certs\") pod \"570bec3d-603c-4f92-b183-c5abb7e799d8\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504774 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t95r9\" (UniqueName: \"kubernetes.io/projected/95116420-b62b-402c-bcbe-17026cba0354-kube-api-access-t95r9\") pod \"95116420-b62b-402c-bcbe-17026cba0354\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504795 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-internal-tls-certs\") pod \"570bec3d-603c-4f92-b183-c5abb7e799d8\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504864 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-combined-ca-bundle\") pod \"95116420-b62b-402c-bcbe-17026cba0354\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.504997 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-httpd-run\") pod \"95116420-b62b-402c-bcbe-17026cba0354\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505024 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-config-data\") pod \"95116420-b62b-402c-bcbe-17026cba0354\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505069 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-config-data\") pod \"570bec3d-603c-4f92-b183-c5abb7e799d8\" (UID: \"570bec3d-603c-4f92-b183-c5abb7e799d8\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505101 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-public-tls-certs\") pod \"95116420-b62b-402c-bcbe-17026cba0354\" (UID: \"95116420-b62b-402c-bcbe-17026cba0354\") "
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505558 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505578 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc9cs\" (UniqueName: \"kubernetes.io/projected/f287ea57-5fcf-42d6-a886-fb5f35962785-kube-api-access-dc9cs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505588 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74700fa7-59df-4201-a7c4-de815b82208e-config-data-generated\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505597 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f287ea57-5fcf-42d6-a886-fb5f35962785-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:46 crc 
kubenswrapper[5047]: I0223 09:02:46.505608 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0b8f96e-aaed-43c4-9906-8920be9f478b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505617 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vldcx\" (UniqueName: \"kubernetes.io/projected/74700fa7-59df-4201-a7c4-de815b82208e-kube-api-access-vldcx\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505626 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505635 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9j7g\" (UniqueName: \"kubernetes.io/projected/a0b8f96e-aaed-43c4-9906-8920be9f478b-kube-api-access-h9j7g\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505644 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxwvf\" (UniqueName: \"kubernetes.io/projected/4ce03d60-441e-4a79-acce-d54444aedfcb-kube-api-access-bxwvf\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505653 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505661 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.505669 5047 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-m6gr4\" (UniqueName: \"kubernetes.io/projected/07094621-ca88-4942-a226-76667658a5bc-kube-api-access-m6gr4\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.507528 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/570bec3d-603c-4f92-b183-c5abb7e799d8-logs" (OuterVolumeSpecName: "logs") pod "570bec3d-603c-4f92-b183-c5abb7e799d8" (UID: "570bec3d-603c-4f92-b183-c5abb7e799d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.508657 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95116420-b62b-402c-bcbe-17026cba0354" (UID: "95116420-b62b-402c-bcbe-17026cba0354"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.517240 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-scripts" (OuterVolumeSpecName: "scripts") pod "95116420-b62b-402c-bcbe-17026cba0354" (UID: "95116420-b62b-402c-bcbe-17026cba0354"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.537741 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95116420-b62b-402c-bcbe-17026cba0354-kube-api-access-t95r9" (OuterVolumeSpecName: "kube-api-access-t95r9") pod "95116420-b62b-402c-bcbe-17026cba0354" (UID: "95116420-b62b-402c-bcbe-17026cba0354"). InnerVolumeSpecName "kube-api-access-t95r9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.576081 5047 scope.go:117] "RemoveContainer" containerID="7061c930f5e73ec3ef1ef35e6ca4955b8a79a7a14eda68ced588ceb181b02467" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.607114 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570bec3d-603c-4f92-b183-c5abb7e799d8-kube-api-access-dq6kc" (OuterVolumeSpecName: "kube-api-access-dq6kc") pod "570bec3d-603c-4f92-b183-c5abb7e799d8" (UID: "570bec3d-603c-4f92-b183-c5abb7e799d8"). InnerVolumeSpecName "kube-api-access-dq6kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.610559 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rh4b\" (UniqueName: \"kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.610684 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.610939 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95116420-b62b-402c-bcbe-17026cba0354-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.610962 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq6kc\" (UniqueName: 
\"kubernetes.io/projected/570bec3d-603c-4f92-b183-c5abb7e799d8-kube-api-access-dq6kc\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.610978 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570bec3d-603c-4f92-b183-c5abb7e799d8-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.610990 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.611005 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t95r9\" (UniqueName: \"kubernetes.io/projected/95116420-b62b-402c-bcbe-17026cba0354-kube-api-access-t95r9\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.611082 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.611139 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:47.611121803 +0000 UTC m=+8289.862448937 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : configmap "openstack-scripts" not found Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.628135 5047 projected.go:194] Error preparing data for projected volume kube-api-access-2rh4b for pod openstack/keystone-ce6c-account-create-update-cm686: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:46 crc kubenswrapper[5047]: E0223 09:02:46.628231 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:47.628188053 +0000 UTC m=+8289.879515187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2rh4b" (UniqueName: "kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.837244 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0" (OuterVolumeSpecName: "mysql-db") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.845711 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.846144 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e") on node "crc" Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.968262 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\") on node \"crc\" " Feb 23 09:02:46 crc kubenswrapper[5047]: I0223 09:02:46.968635 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a7b4455-624b-4359-bed6-93674ec9fe0e\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.047019 5047 scope.go:117] "RemoveContainer" containerID="a616b652208daefb07b38cd626733fc90a295a0a07102e8c6660ad4f1c65e03e" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082112 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082166 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082181 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082196 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-317c-account-create-update-ppk28"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082208 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-317c-account-create-update-ppk28"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082220 5047 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b60b-account-create-update-rwx74"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082244 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b60b-account-create-update-rwx74"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082259 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.082270 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.131062 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.169441 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.188687 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4m9l\" (UniqueName: \"kubernetes.io/projected/2d714b4f-02d1-433b-ba95-54c199957dce-kube-api-access-q4m9l\") pod \"2d714b4f-02d1-433b-ba95-54c199957dce\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.189003 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-certs\") pod \"2d714b4f-02d1-433b-ba95-54c199957dce\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.189518 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-combined-ca-bundle\") pod \"2d714b4f-02d1-433b-ba95-54c199957dce\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.189571 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-config\") pod \"2d714b4f-02d1-433b-ba95-54c199957dce\" (UID: \"2d714b4f-02d1-433b-ba95-54c199957dce\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.190345 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.231312 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2d714b4f-02d1-433b-ba95-54c199957dce-kube-api-access-q4m9l" (OuterVolumeSpecName: "kube-api-access-q4m9l") pod "2d714b4f-02d1-433b-ba95-54c199957dce" (UID: "2d714b4f-02d1-433b-ba95-54c199957dce"). InnerVolumeSpecName "kube-api-access-q4m9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.292878 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4m9l\" (UniqueName: \"kubernetes.io/projected/2d714b4f-02d1-433b-ba95-54c199957dce-kube-api-access-q4m9l\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.320628 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2d714b4f-02d1-433b-ba95-54c199957dce","Type":"ContainerDied","Data":"1134957310d5ea06b0897ec767a16c8f525402bf936aa6b654d8cce8ae60c26e"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.321110 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.322765 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.323923 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aad1-account-create-update-n2bdk" event={"ID":"dd7a46ed-9033-4c72-9f92-34816276560b","Type":"ContainerDied","Data":"38bcacf8966c3b7e4b625c708afec5920738c423aeb323ca9d190fd247101942"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.323969 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38bcacf8966c3b7e4b625c708afec5920738c423aeb323ca9d190fd247101942" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.327810 5047 scope.go:117] "RemoveContainer" containerID="6a1e38661d1c5ddbb45bad7c88e6e2bbc8848d6927477296120c0dddf2445603" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.328049 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6c0a-account-create-update-mngkq" event={"ID":"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271","Type":"ContainerDied","Data":"86d1fdb997c8a7b9c4ec3111182e27f9ead55cc4be32d2b1ccfdf79eba4ad122"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.328469 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d1fdb997c8a7b9c4ec3111182e27f9ead55cc4be32d2b1ccfdf79eba4ad122" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.333704 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58486f655d-lth95" event={"ID":"8596ec60-89ed-43d9-be63-b7130fd0f937","Type":"ContainerDied","Data":"08643c152f10fec24bf41bde57987b1345ca076fd1ae2d95553fdcbd849612e5"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.333759 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08643c152f10fec24bf41bde57987b1345ca076fd1ae2d95553fdcbd849612e5" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.344578 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"4051daa4-7ea3-4ab2-ad1f-52353e0ad995","Type":"ContainerDied","Data":"0e668d66a5382cbefe7b815098f939d1374da758802c5066e2cf912cdf260755"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.344683 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e668d66a5382cbefe7b815098f939d1374da758802c5066e2cf912cdf260755" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.351929 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95116420-b62b-402c-bcbe-17026cba0354","Type":"ContainerDied","Data":"e459a4cfaf687ea8deefc617116bb6434c5009226114df0c9539ae23a43598d7"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.352028 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.368051 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"570bec3d-603c-4f92-b183-c5abb7e799d8","Type":"ContainerDied","Data":"ff62d255a189ffb7e6fe77844e53d3e02120440be58166aca5aaf2e16703b99a"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.368147 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.394893 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.394708 5047 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.395161 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data podName:db558a41-6dbf-4b18-af50-6a5311530ef4 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:55.395146631 +0000 UTC m=+8297.646473765 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data") pod "rabbitmq-cell1-server-0" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4") : configmap "rabbitmq-cell1-config-data" not found Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.395832 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f58d8574d-t4gn8" event={"ID":"50aef737-c888-466c-92d0-9c683267d266","Type":"ContainerDied","Data":"397109ef6a97ec926b5e9f308fb40a8bd07bff7b12782befbb76658a1a9c9250"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.395964 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="397109ef6a97ec926b5e9f308fb40a8bd07bff7b12782befbb76658a1a9c9250" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.403807 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3464c846-13b9-479e-b9af-3d571f03b284","Type":"ContainerDied","Data":"94047716d66aceb29fd35f503786f8bac1aca08ab50d7f10030a5a48d9126a7f"} 
Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.403847 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94047716d66aceb29fd35f503786f8bac1aca08ab50d7f10030a5a48d9126a7f" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.404942 5047 generic.go:334] "Generic (PLEG): container finished" podID="39cf7673-4d38-49e0-9b86-f80c3949fd06" containerID="2fb193ba990e8911587f36aa7890ab58c29c03d0ee9c8cdec4e9fb37cf8c3f1b" exitCode=0 Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.404996 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"39cf7673-4d38-49e0-9b86-f80c3949fd06","Type":"ContainerDied","Data":"2fb193ba990e8911587f36aa7890ab58c29c03d0ee9c8cdec4e9fb37cf8c3f1b"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.406093 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d854f95bd-qf2l8" event={"ID":"1058e169-d572-43c5-80b3-d5f2f2c78afb","Type":"ContainerDied","Data":"ded9d9ca87a57a22eb2252345cbf14e358b71875343afbe48bf8dba37631cd84"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.406119 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ded9d9ca87a57a22eb2252345cbf14e358b71875343afbe48bf8dba37631cd84" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.419796 5047 generic.go:334] "Generic (PLEG): container finished" podID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerID="a65243912ed9e00054f4ebb306fb5fe73793f4ea7a467cd196686a7efbdd8df4" exitCode=0 Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.419948 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerDied","Data":"a65243912ed9e00054f4ebb306fb5fe73793f4ea7a467cd196686a7efbdd8df4"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.424278 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.428484 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rl8wd" event={"ID":"b4565dbe-dc04-452f-a79e-bc09cb299f29","Type":"ContainerStarted","Data":"5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5"} Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.428764 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ff9694944-nmq4c" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.429154 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/root-account-create-update-rl8wd" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerName="mariadb-account-create-update" containerID="cri-o://5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5" gracePeriod=30 Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.429309 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5aa7-account-create-update-khr6g" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.429356 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.429391 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.429416 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ef9-account-create-update-xrnmv" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.434176 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.449403 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "570bec3d-603c-4f92-b183-c5abb7e799d8" (UID: "570bec3d-603c-4f92-b183-c5abb7e799d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.452082 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.479739 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.480101 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-rl8wd" podStartSLOduration=9.48007736 podStartE2EDuration="9.48007736s" podCreationTimestamp="2026-02-23 09:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:02:47.456957416 +0000 UTC m=+8289.708284570" watchObservedRunningTime="2026-02-23 09:02:47.48007736 +0000 UTC m=+8289.731404484" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.497008 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-logs\") pod \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.497226 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfq4j\" (UniqueName: \"kubernetes.io/projected/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-kube-api-access-wfq4j\") pod \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.497364 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs\") pod \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.497441 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-config-data\") pod \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\" (UID: 
\"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.497470 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-combined-ca-bundle\") pod \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.499768 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-logs" (OuterVolumeSpecName: "logs") pod "4051daa4-7ea3-4ab2-ad1f-52353e0ad995" (UID: "4051daa4-7ea3-4ab2-ad1f-52353e0ad995"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.506354 5047 scope.go:117] "RemoveContainer" containerID="3cd1af755bfcf4970c210c91d87839f5bae5b4edc4ad2683851e88ccef337c33" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.509030 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.509069 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.509085 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.517592 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.519240 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0") on node "crc" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.525947 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ce03d60-441e-4a79-acce-d54444aedfcb" (UID: "4ce03d60-441e-4a79-acce-d54444aedfcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.539179 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "74700fa7-59df-4201-a7c4-de815b82208e" (UID: "74700fa7-59df-4201-a7c4-de815b82208e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.542525 5047 scope.go:117] "RemoveContainer" containerID="f23fd67e4cf7fa54402cf3da162be46abfc25906ea3cdcb5916c37e37fe8051a" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.547771 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.549628 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-kube-api-access-wfq4j" (OuterVolumeSpecName: "kube-api-access-wfq4j") pod "4051daa4-7ea3-4ab2-ad1f-52353e0ad995" (UID: "4051daa4-7ea3-4ab2-ad1f-52353e0ad995"). InnerVolumeSpecName "kube-api-access-wfq4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.549952 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f58d8574d-t4gn8" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.566023 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d854f95bd-qf2l8" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.590788 5047 scope.go:117] "RemoveContainer" containerID="95c93cfd98d085f0620a320a6bc1bd384f6afe679ed96aea9f76c9b0475d04ab" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.590800 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "95116420-b62b-402c-bcbe-17026cba0354" (UID: "95116420-b62b-402c-bcbe-17026cba0354"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.590866 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerName="galera" containerID="cri-o://4373b06bebc37262c61b269b46fec7e7e686ae9f047fd098351acbd0de2a4b21" gracePeriod=29 Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.590886 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5aa7-account-create-update-khr6g"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.600052 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5aa7-account-create-update-khr6g"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data-custom\") pod \"50aef737-c888-466c-92d0-9c683267d266\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612173 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data-custom\") pod \"1058e169-d572-43c5-80b3-d5f2f2c78afb\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612232 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-internal-tls-certs\") pod \"50aef737-c888-466c-92d0-9c683267d266\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612291 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-combined-ca-bundle\") pod \"1058e169-d572-43c5-80b3-d5f2f2c78afb\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612337 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2925\" (UniqueName: \"kubernetes.io/projected/1058e169-d572-43c5-80b3-d5f2f2c78afb-kube-api-access-q2925\") pod \"1058e169-d572-43c5-80b3-d5f2f2c78afb\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612477 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1058e169-d572-43c5-80b3-d5f2f2c78afb-logs\") pod \"1058e169-d572-43c5-80b3-d5f2f2c78afb\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612521 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-combined-ca-bundle\") pod \"50aef737-c888-466c-92d0-9c683267d266\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612558 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5czz\" (UniqueName: \"kubernetes.io/projected/3464c846-13b9-479e-b9af-3d571f03b284-kube-api-access-r5czz\") pod \"3464c846-13b9-479e-b9af-3d571f03b284\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612778 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgmlt\" (UniqueName: \"kubernetes.io/projected/50aef737-c888-466c-92d0-9c683267d266-kube-api-access-jgmlt\") pod \"50aef737-c888-466c-92d0-9c683267d266\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " Feb 23 
09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612835 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-public-tls-certs\") pod \"1058e169-d572-43c5-80b3-d5f2f2c78afb\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612886 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data\") pod \"1058e169-d572-43c5-80b3-d5f2f2c78afb\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.612932 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-combined-ca-bundle\") pod \"3464c846-13b9-479e-b9af-3d571f03b284\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.613043 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-public-tls-certs\") pod \"50aef737-c888-466c-92d0-9c683267d266\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.613091 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data\") pod \"50aef737-c888-466c-92d0-9c683267d266\" (UID: \"50aef737-c888-466c-92d0-9c683267d266\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.613123 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-internal-tls-certs\") pod \"1058e169-d572-43c5-80b3-d5f2f2c78afb\" (UID: \"1058e169-d572-43c5-80b3-d5f2f2c78afb\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.613174 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-config-data\") pod \"3464c846-13b9-479e-b9af-3d571f03b284\" (UID: \"3464c846-13b9-479e-b9af-3d571f03b284\") " Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.613641 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.619523 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10eb0f3e-7b1d-4327-9a9e-e03588e517e0\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.619560 5047 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74700fa7-59df-4201-a7c4-de815b82208e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.619573 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.619583 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfq4j\" (UniqueName: 
\"kubernetes.io/projected/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-kube-api-access-wfq4j\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.619595 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.620266 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.620334 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:49.620308747 +0000 UTC m=+8291.871635881 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : configmap "openstack-scripts" not found Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.627509 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1058e169-d572-43c5-80b3-d5f2f2c78afb-logs" (OuterVolumeSpecName: "logs") pod "1058e169-d572-43c5-80b3-d5f2f2c78afb" (UID: "1058e169-d572-43c5-80b3-d5f2f2c78afb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.641491 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-xrnmv"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.646700 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95116420-b62b-402c-bcbe-17026cba0354" (UID: "95116420-b62b-402c-bcbe-17026cba0354"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.648319 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3464c846-13b9-479e-b9af-3d571f03b284-kube-api-access-r5czz" (OuterVolumeSpecName: "kube-api-access-r5czz") pod "3464c846-13b9-479e-b9af-3d571f03b284" (UID: "3464c846-13b9-479e-b9af-3d571f03b284"). InnerVolumeSpecName "kube-api-access-r5czz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.652323 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0ef9-account-create-update-xrnmv"] Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.652530 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50aef737-c888-466c-92d0-9c683267d266-kube-api-access-jgmlt" (OuterVolumeSpecName: "kube-api-access-jgmlt") pod "50aef737-c888-466c-92d0-9c683267d266" (UID: "50aef737-c888-466c-92d0-9c683267d266"). InnerVolumeSpecName "kube-api-access-jgmlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.668163 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1058e169-d572-43c5-80b3-d5f2f2c78afb-kube-api-access-q2925" (OuterVolumeSpecName: "kube-api-access-q2925") pod "1058e169-d572-43c5-80b3-d5f2f2c78afb" (UID: "1058e169-d572-43c5-80b3-d5f2f2c78afb"). InnerVolumeSpecName "kube-api-access-q2925". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.680062 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "50aef737-c888-466c-92d0-9c683267d266" (UID: "50aef737-c888-466c-92d0-9c683267d266"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.681343 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1058e169-d572-43c5-80b3-d5f2f2c78afb" (UID: "1058e169-d572-43c5-80b3-d5f2f2c78afb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.723658 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rh4b\" (UniqueName: \"kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.723988 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.724007 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.724027 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2925\" (UniqueName: \"kubernetes.io/projected/1058e169-d572-43c5-80b3-d5f2f2c78afb-kube-api-access-q2925\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.724041 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1058e169-d572-43c5-80b3-d5f2f2c78afb-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.724054 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5czz\" (UniqueName: \"kubernetes.io/projected/3464c846-13b9-479e-b9af-3d571f03b284-kube-api-access-r5czz\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.724066 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgmlt\" 
(UniqueName: \"kubernetes.io/projected/50aef737-c888-466c-92d0-9c683267d266-kube-api-access-jgmlt\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.724083 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.729167 5047 projected.go:194] Error preparing data for projected volume kube-api-access-2rh4b for pod openstack/keystone-ce6c-account-create-update-cm686: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.729272 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:49.729246312 +0000 UTC m=+8291.980573446 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2rh4b" (UniqueName: "kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.764113 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.831390 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.869457 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d714b4f-02d1-433b-ba95-54c199957dce" (UID: "2d714b4f-02d1-433b-ba95-54c199957dce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.880955 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-config-data" (OuterVolumeSpecName: "config-data") pod "570bec3d-603c-4f92-b183-c5abb7e799d8" (UID: "570bec3d-603c-4f92-b183-c5abb7e799d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.887751 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "9205930b-2303-4b01-a3bf-cf4ef3ad0a49" (UID: "9205930b-2303-4b01-a3bf-cf4ef3ad0a49"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.891459 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50aef737-c888-466c-92d0-9c683267d266" (UID: "50aef737-c888-466c-92d0-9c683267d266"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.904404 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4051daa4-7ea3-4ab2-ad1f-52353e0ad995" (UID: "4051daa4-7ea3-4ab2-ad1f-52353e0ad995"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.917458 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4ce03d60-441e-4a79-acce-d54444aedfcb" (UID: "4ce03d60-441e-4a79-acce-d54444aedfcb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.936366 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.936403 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.936413 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.936422 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.936433 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9205930b-2303-4b01-a3bf-cf4ef3ad0a49-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.936442 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.951531 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.965184 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.967535 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 09:02:47 crc kubenswrapper[5047]: E0223 09:02:47.967572 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" containerName="nova-cell0-conductor-conductor" Feb 23 09:02:47 crc kubenswrapper[5047]: I0223 09:02:47.992694 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.005895 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-config-data" (OuterVolumeSpecName: "config-data") pod "3464c846-13b9-479e-b9af-3d571f03b284" (UID: "3464c846-13b9-479e-b9af-3d571f03b284"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.051664 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.051722 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.078431 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "2d714b4f-02d1-433b-ba95-54c199957dce" (UID: "2d714b4f-02d1-433b-ba95-54c199957dce"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.152703 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-config-data" (OuterVolumeSpecName: "config-data") pod "07094621-ca88-4942-a226-76667658a5bc" (UID: "07094621-ca88-4942-a226-76667658a5bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.156465 5047 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.156508 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.187531 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "570bec3d-603c-4f92-b183-c5abb7e799d8" (UID: "570bec3d-603c-4f92-b183-c5abb7e799d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.203977 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.204469 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1058e169-d572-43c5-80b3-d5f2f2c78afb" (UID: "1058e169-d572-43c5-80b3-d5f2f2c78afb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.245299 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58486f655d-lth95"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.250227 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data" (OuterVolumeSpecName: "config-data") pod "3d570b43-d0ff-42d7-a305-9d7c2f9f9881" (UID: "3d570b43-d0ff-42d7-a305-9d7c2f9f9881"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.254361 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-mngkq"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.261672 5047 scope.go:117] "RemoveContainer" containerID="438245a707c3bbe792477e7783157c77ef4990e9609e3fae09f2db6284e24d03"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.262118 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-combined-ca-bundle\") pod \"8596ec60-89ed-43d9-be63-b7130fd0f937\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.262183 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-public-tls-certs\") pod \"8596ec60-89ed-43d9-be63-b7130fd0f937\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.262309 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data-custom\") pod \"8596ec60-89ed-43d9-be63-b7130fd0f937\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.262341 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data\") pod \"8596ec60-89ed-43d9-be63-b7130fd0f937\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.262416 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpx64\" (UniqueName: \"kubernetes.io/projected/8596ec60-89ed-43d9-be63-b7130fd0f937-kube-api-access-gpx64\") pod \"8596ec60-89ed-43d9-be63-b7130fd0f937\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.262486 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-internal-tls-certs\") pod \"8596ec60-89ed-43d9-be63-b7130fd0f937\" (UID: \"8596ec60-89ed-43d9-be63-b7130fd0f937\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.271833 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.271984 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.271999 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.272266 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d570b43-d0ff-42d7-a305-9d7c2f9f9881-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.272320 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8596ec60-89ed-43d9-be63-b7130fd0f937-kube-api-access-gpx64" (OuterVolumeSpecName: "kube-api-access-gpx64") pod "8596ec60-89ed-43d9-be63-b7130fd0f937" (UID: "8596ec60-89ed-43d9-be63-b7130fd0f937"). InnerVolumeSpecName "kube-api-access-gpx64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.274066 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-config-data" (OuterVolumeSpecName: "config-data") pod "4051daa4-7ea3-4ab2-ad1f-52353e0ad995" (UID: "4051daa4-7ea3-4ab2-ad1f-52353e0ad995"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.275310 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-n2bdk"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.276975 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8596ec60-89ed-43d9-be63-b7130fd0f937" (UID: "8596ec60-89ed-43d9-be63-b7130fd0f937"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.280483 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "2d714b4f-02d1-433b-ba95-54c199957dce" (UID: "2d714b4f-02d1-433b-ba95-54c199957dce"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.284518 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3464c846-13b9-479e-b9af-3d571f03b284" (UID: "3464c846-13b9-479e-b9af-3d571f03b284"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.309161 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-r57lp"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.309269 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.311768 5047 scope.go:117] "RemoveContainer" containerID="0aa17ecda203fdcd836c6d88a867e21da21bf1e6661eebac5135520ec1cd39ff"
Feb 23 09:02:48 crc kubenswrapper[5047]: E0223 09:02:48.318778 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.335611 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1058e169-d572-43c5-80b3-d5f2f2c78afb" (UID: "1058e169-d572-43c5-80b3-d5f2f2c78afb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: E0223 09:02:48.342859 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.343089 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-s6wxg"
Feb 23 09:02:48 crc kubenswrapper[5047]: E0223 09:02:48.353977 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 09:02:48 crc kubenswrapper[5047]: E0223 09:02:48.354028 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="01713a47-8bed-4038-8339-bdcd77e6e1db" containerName="nova-cell1-conductor-conductor"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.354384 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-cm686"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.373101 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0953-account-create-update-psp84"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.376485 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346c3f0d-e5fb-40f6-bd1e-65679466165f" path="/var/lib/kubelet/pods/346c3f0d-e5fb-40f6-bd1e-65679466165f/volumes"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.378939 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac0b4e9-66ad-4719-b036-d9e833ad7a37" path="/var/lib/kubelet/pods/9ac0b4e9-66ad-4719-b036-d9e833ad7a37/volumes"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.384381 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d4528b-cbbd-4c90-97e6-73924844d615-operator-scripts\") pod \"26d4528b-cbbd-4c90-97e6-73924844d615\" (UID: \"26d4528b-cbbd-4c90-97e6-73924844d615\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.401454 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t46w\" (UniqueName: \"kubernetes.io/projected/e444ce7d-b56a-406c-91ff-4623a469c13a-kube-api-access-4t46w\") pod \"e444ce7d-b56a-406c-91ff-4623a469c13a\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.401477 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nnkk\" (UniqueName: \"kubernetes.io/projected/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-kube-api-access-7nnkk\") pod \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\" (UID: \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.401594 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8bbc\" (UniqueName: \"kubernetes.io/projected/26d4528b-cbbd-4c90-97e6-73924844d615-kube-api-access-l8bbc\") pod \"26d4528b-cbbd-4c90-97e6-73924844d615\" (UID: \"26d4528b-cbbd-4c90-97e6-73924844d615\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.401628 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjvw8\" (UniqueName: \"kubernetes.io/projected/dd7a46ed-9033-4c72-9f92-34816276560b-kube-api-access-vjvw8\") pod \"dd7a46ed-9033-4c72-9f92-34816276560b\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.401652 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a46ed-9033-4c72-9f92-34816276560b-operator-scripts\") pod \"dd7a46ed-9033-4c72-9f92-34816276560b\" (UID: \"dd7a46ed-9033-4c72-9f92-34816276560b\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.401682 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts\") pod \"e444ce7d-b56a-406c-91ff-4623a469c13a\" (UID: \"e444ce7d-b56a-406c-91ff-4623a469c13a\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.401729 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-operator-scripts\") pod \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\" (UID: \"4ad1ab7e-d139-471c-9d7c-ceaea0bc0271\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402323 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0b8f96e-aaed-43c4-9906-8920be9f478b" path="/var/lib/kubelet/pods/a0b8f96e-aaed-43c4-9906-8920be9f478b/volumes"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402548 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402562 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402573 5047 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d714b4f-02d1-433b-ba95-54c199957dce-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402584 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402594 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3464c846-13b9-479e-b9af-3d571f03b284-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402603 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpx64\" (UniqueName: \"kubernetes.io/projected/8596ec60-89ed-43d9-be63-b7130fd0f937-kube-api-access-gpx64\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402700 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac795337-f981-4559-a2d7-9e3eb81f9e33" path="/var/lib/kubelet/pods/ac795337-f981-4559-a2d7-9e3eb81f9e33/volumes"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.402814 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rl8wd"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.388586 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d4528b-cbbd-4c90-97e6-73924844d615-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26d4528b-cbbd-4c90-97e6-73924844d615" (UID: "26d4528b-cbbd-4c90-97e6-73924844d615"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.403057 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ad1ab7e-d139-471c-9d7c-ceaea0bc0271" (UID: "4ad1ab7e-d139-471c-9d7c-ceaea0bc0271"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.403095 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.408050 5047 scope.go:117] "RemoveContainer" containerID="a5b2ffecbcd348e06e8ff7bb5a39ad7d2084d7e9a705dfca2cffe091ad053ce8"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.409929 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e444ce7d-b56a-406c-91ff-4623a469c13a" (UID: "e444ce7d-b56a-406c-91ff-4623a469c13a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.411140 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7a46ed-9033-4c72-9f92-34816276560b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd7a46ed-9033-4c72-9f92-34816276560b" (UID: "dd7a46ed-9033-4c72-9f92-34816276560b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.417765 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6" path="/var/lib/kubelet/pods/ea20c1ae-ae2b-4eae-a5ff-160708d7a9e6/volumes"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.420001 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f287ea57-5fcf-42d6-a886-fb5f35962785" path="/var/lib/kubelet/pods/f287ea57-5fcf-42d6-a886-fb5f35962785/volumes"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.420477 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe43cf5f-4813-4df4-ba76-345d096d5816" path="/var/lib/kubelet/pods/fe43cf5f-4813-4df4-ba76-345d096d5816/volumes"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.422883 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "50aef737-c888-466c-92d0-9c683267d266" (UID: "50aef737-c888-466c-92d0-9c683267d266"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.453875 5047 generic.go:334] "Generic (PLEG): container finished" podID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerID="5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5" exitCode=1
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.453994 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rl8wd"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.481098 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d4528b-cbbd-4c90-97e6-73924844d615-kube-api-access-l8bbc" (OuterVolumeSpecName: "kube-api-access-l8bbc") pod "26d4528b-cbbd-4c90-97e6-73924844d615" (UID: "26d4528b-cbbd-4c90-97e6-73924844d615"). InnerVolumeSpecName "kube-api-access-l8bbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.496643 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e444ce7d-b56a-406c-91ff-4623a469c13a-kube-api-access-4t46w" (OuterVolumeSpecName: "kube-api-access-4t46w") pod "e444ce7d-b56a-406c-91ff-4623a469c13a" (UID: "e444ce7d-b56a-406c-91ff-4623a469c13a"). InnerVolumeSpecName "kube-api-access-4t46w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.500743 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7a46ed-9033-4c72-9f92-34816276560b-kube-api-access-vjvw8" (OuterVolumeSpecName: "kube-api-access-vjvw8") pod "dd7a46ed-9033-4c72-9f92-34816276560b" (UID: "dd7a46ed-9033-4c72-9f92-34816276560b"). InnerVolumeSpecName "kube-api-access-vjvw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.503300 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0953-account-create-update-psp84"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.503929 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4051daa4-7ea3-4ab2-ad1f-52353e0ad995" (UID: "4051daa4-7ea3-4ab2-ad1f-52353e0ad995"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504126 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-memcached-tls-certs\") pod \"39cf7673-4d38-49e0-9b86-f80c3949fd06\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504256 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xmtb\" (UniqueName: \"kubernetes.io/projected/39cf7673-4d38-49e0-9b86-f80c3949fd06-kube-api-access-6xmtb\") pod \"39cf7673-4d38-49e0-9b86-f80c3949fd06\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504311 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-kolla-config\") pod \"39cf7673-4d38-49e0-9b86-f80c3949fd06\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504375 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4565dbe-dc04-452f-a79e-bc09cb299f29-operator-scripts\") pod \"b4565dbe-dc04-452f-a79e-bc09cb299f29\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504419 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7knbv\" (UniqueName: \"kubernetes.io/projected/b4565dbe-dc04-452f-a79e-bc09cb299f29-kube-api-access-7knbv\") pod \"b4565dbe-dc04-452f-a79e-bc09cb299f29\" (UID: \"b4565dbe-dc04-452f-a79e-bc09cb299f29\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504458 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-config-data\") pod \"39cf7673-4d38-49e0-9b86-f80c3949fd06\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504682 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7607a87e-f072-4cdf-ba60-40b579d694ab-operator-scripts\") pod \"7607a87e-f072-4cdf-ba60-40b579d694ab\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.504718 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbvp4\" (UniqueName: \"kubernetes.io/projected/7607a87e-f072-4cdf-ba60-40b579d694ab-kube-api-access-wbvp4\") pod \"7607a87e-f072-4cdf-ba60-40b579d694ab\" (UID: \"7607a87e-f072-4cdf-ba60-40b579d694ab\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.505038 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs\") pod \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\" (UID: \"4051daa4-7ea3-4ab2-ad1f-52353e0ad995\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.505062 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-combined-ca-bundle\") pod \"39cf7673-4d38-49e0-9b86-f80c3949fd06\" (UID: \"39cf7673-4d38-49e0-9b86-f80c3949fd06\") "
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.505165 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-kube-api-access-7nnkk" (OuterVolumeSpecName: "kube-api-access-7nnkk") pod "4ad1ab7e-d139-471c-9d7c-ceaea0bc0271" (UID: "4ad1ab7e-d139-471c-9d7c-ceaea0bc0271"). InnerVolumeSpecName "kube-api-access-7nnkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.505610 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7607a87e-f072-4cdf-ba60-40b579d694ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7607a87e-f072-4cdf-ba60-40b579d694ab" (UID: "7607a87e-f072-4cdf-ba60-40b579d694ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.505958 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "39cf7673-4d38-49e0-9b86-f80c3949fd06" (UID: "39cf7673-4d38-49e0-9b86-f80c3949fd06"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: W0223 09:02:48.508096 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4051daa4-7ea3-4ab2-ad1f-52353e0ad995/volumes/kubernetes.io~secret/nova-metadata-tls-certs
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.508118 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4051daa4-7ea3-4ab2-ad1f-52353e0ad995" (UID: "4051daa4-7ea3-4ab2-ad1f-52353e0ad995"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.508446 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4565dbe-dc04-452f-a79e-bc09cb299f29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4565dbe-dc04-452f-a79e-bc09cb299f29" (UID: "b4565dbe-dc04-452f-a79e-bc09cb299f29"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.508484 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8bbc\" (UniqueName: \"kubernetes.io/projected/26d4528b-cbbd-4c90-97e6-73924844d615-kube-api-access-l8bbc\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.509486 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-config-data" (OuterVolumeSpecName: "config-data") pod "39cf7673-4d38-49e0-9b86-f80c3949fd06" (UID: "39cf7673-4d38-49e0-9b86-f80c3949fd06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.513262 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjvw8\" (UniqueName: \"kubernetes.io/projected/dd7a46ed-9033-4c72-9f92-34816276560b-kube-api-access-vjvw8\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515427 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a46ed-9033-4c72-9f92-34816276560b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515457 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e444ce7d-b56a-406c-91ff-4623a469c13a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515467 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515477 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d4528b-cbbd-4c90-97e6-73924844d615-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515487 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7607a87e-f072-4cdf-ba60-40b579d694ab-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515497 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t46w\" (UniqueName: \"kubernetes.io/projected/e444ce7d-b56a-406c-91ff-4623a469c13a-kube-api-access-4t46w\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515508 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nnkk\" (UniqueName: \"kubernetes.io/projected/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271-kube-api-access-7nnkk\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.515518 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.522495 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rl8wd" event={"ID":"b4565dbe-dc04-452f-a79e-bc09cb299f29","Type":"ContainerDied","Data":"5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5"}
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.522545 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.522567 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.522766 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.522777 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rl8wd" event={"ID":"b4565dbe-dc04-452f-a79e-bc09cb299f29","Type":"ContainerDied","Data":"758a2b9824548a6ea893ec6c329b255e312a6dbcad06f364ad1875b79ec04052"}
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.522808 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0953-account-create-update-psp84" event={"ID":"7607a87e-f072-4cdf-ba60-40b579d694ab","Type":"ContainerDied","Data":"d0c3640f49ef24287e8b18c6b16b35c5a762a34a00b9946debda1cf73e838699"}
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.522827 5047 scope.go:117] "RemoveContainer" containerID="5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.527368 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-dcb1-account-create-update-s6wxg"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.527992 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dcb1-account-create-update-s6wxg" event={"ID":"26d4528b-cbbd-4c90-97e6-73924844d615","Type":"ContainerDied","Data":"d933e50d0b2c355d9304b88dfa2ef6915b28e41d72fb265398d0ebceb147a3fb"}
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.536637 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.536785 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"39cf7673-4d38-49e0-9b86-f80c3949fd06","Type":"ContainerDied","Data":"01cddf8ee5f86dd10417671e610044e5e1f989298e6722e7beab0a6a68c0b35b"}
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.556111 5047 generic.go:334] "Generic (PLEG): container finished" podID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerID="1b91d13309e4eceb1e8a3d37f030db4d6384384b7d04e3f74e24403de0da2f57" exitCode=0
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.556144 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7607a87e-f072-4cdf-ba60-40b579d694ab-kube-api-access-wbvp4" (OuterVolumeSpecName: "kube-api-access-wbvp4") pod "7607a87e-f072-4cdf-ba60-40b579d694ab" (UID: "7607a87e-f072-4cdf-ba60-40b579d694ab"). InnerVolumeSpecName "kube-api-access-wbvp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.556198 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d8f82aad-7df9-4b14-a328-2cc708aeed84","Type":"ContainerDied","Data":"1b91d13309e4eceb1e8a3d37f030db4d6384384b7d04e3f74e24403de0da2f57"}
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.560721 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aad1-account-create-update-n2bdk"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.560788 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b6cf-account-create-update-r57lp" event={"ID":"e444ce7d-b56a-406c-91ff-4623a469c13a","Type":"ContainerDied","Data":"481f36f5de5af782dc8c93afd0fa15a20caca6864cba4c5a5b85e8bbeb86442d"}
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.560828 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b6cf-account-create-update-r57lp"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.561369 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.561431 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.561463 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f58d8574d-t4gn8"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.561493 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58486f655d-lth95"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.561531 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d854f95bd-qf2l8"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.561554 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6c0a-account-create-update-mngkq"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.561576 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ce6c-account-create-update-cm686"
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.566195 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39cf7673-4d38-49e0-9b86-f80c3949fd06-kube-api-access-6xmtb" (OuterVolumeSpecName: "kube-api-access-6xmtb") pod "39cf7673-4d38-49e0-9b86-f80c3949fd06" (UID: "39cf7673-4d38-49e0-9b86-f80c3949fd06"). InnerVolumeSpecName "kube-api-access-6xmtb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.568128 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4565dbe-dc04-452f-a79e-bc09cb299f29-kube-api-access-7knbv" (OuterVolumeSpecName: "kube-api-access-7knbv") pod "b4565dbe-dc04-452f-a79e-bc09cb299f29" (UID: "b4565dbe-dc04-452f-a79e-bc09cb299f29"). InnerVolumeSpecName "kube-api-access-7knbv".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.574244 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.579282 5047 scope.go:117] "RemoveContainer" containerID="e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.591183 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8596ec60-89ed-43d9-be63-b7130fd0f937" (UID: "8596ec60-89ed-43d9-be63-b7130fd0f937"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.595594 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617135 5047 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4051daa4-7ea3-4ab2-ad1f-52353e0ad995-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617169 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xmtb\" (UniqueName: \"kubernetes.io/projected/39cf7673-4d38-49e0-9b86-f80c3949fd06-kube-api-access-6xmtb\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617179 5047 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617187 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617195 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4565dbe-dc04-452f-a79e-bc09cb299f29-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617204 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7knbv\" (UniqueName: \"kubernetes.io/projected/b4565dbe-dc04-452f-a79e-bc09cb299f29-kube-api-access-7knbv\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617214 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/39cf7673-4d38-49e0-9b86-f80c3949fd06-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.617223 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbvp4\" (UniqueName: \"kubernetes.io/projected/7607a87e-f072-4cdf-ba60-40b579d694ab-kube-api-access-wbvp4\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.629984 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.634847 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.650308 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "570bec3d-603c-4f92-b183-c5abb7e799d8" (UID: "570bec3d-603c-4f92-b183-c5abb7e799d8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.661367 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-config-data" (OuterVolumeSpecName: "config-data") pod "4ce03d60-441e-4a79-acce-d54444aedfcb" (UID: "4ce03d60-441e-4a79-acce-d54444aedfcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.688313 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1058e169-d572-43c5-80b3-d5f2f2c78afb" (UID: "1058e169-d572-43c5-80b3-d5f2f2c78afb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.694128 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "07094621-ca88-4942-a226-76667658a5bc" (UID: "07094621-ca88-4942-a226-76667658a5bc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.719022 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce03d60-441e-4a79-acce-d54444aedfcb-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.719052 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/570bec3d-603c-4f92-b183-c5abb7e799d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.719060 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.719068 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.754083 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "07094621-ca88-4942-a226-76667658a5bc" (UID: "07094621-ca88-4942-a226-76667658a5bc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.761095 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-config-data" (OuterVolumeSpecName: "config-data") pod "95116420-b62b-402c-bcbe-17026cba0354" (UID: "95116420-b62b-402c-bcbe-17026cba0354"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.771067 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data" (OuterVolumeSpecName: "config-data") pod "1058e169-d572-43c5-80b3-d5f2f2c78afb" (UID: "1058e169-d572-43c5-80b3-d5f2f2c78afb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.793030 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07094621-ca88-4942-a226-76667658a5bc" (UID: "07094621-ca88-4942-a226-76667658a5bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.804277 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "39cf7673-4d38-49e0-9b86-f80c3949fd06" (UID: "39cf7673-4d38-49e0-9b86-f80c3949fd06"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.810026 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data" (OuterVolumeSpecName: "config-data") pod "8596ec60-89ed-43d9-be63-b7130fd0f937" (UID: "8596ec60-89ed-43d9-be63-b7130fd0f937"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.817478 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39cf7673-4d38-49e0-9b86-f80c3949fd06" (UID: "39cf7673-4d38-49e0-9b86-f80c3949fd06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.820831 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1058e169-d572-43c5-80b3-d5f2f2c78afb-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.820866 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.820877 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95116420-b62b-402c-bcbe-17026cba0354-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.820888 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.820898 5047 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/39cf7673-4d38-49e0-9b86-f80c3949fd06-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.820921 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.820930 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07094621-ca88-4942-a226-76667658a5bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.843164 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8596ec60-89ed-43d9-be63-b7130fd0f937" (UID: "8596ec60-89ed-43d9-be63-b7130fd0f937"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.845396 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data" (OuterVolumeSpecName: "config-data") pod "50aef737-c888-466c-92d0-9c683267d266" (UID: "50aef737-c888-466c-92d0-9c683267d266"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.852025 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "50aef737-c888-466c-92d0-9c683267d266" (UID: "50aef737-c888-466c-92d0-9c683267d266"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.870027 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8596ec60-89ed-43d9-be63-b7130fd0f937" (UID: "8596ec60-89ed-43d9-be63-b7130fd0f937"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.922328 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.922380 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.922395 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8596ec60-89ed-43d9-be63-b7130fd0f937-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:48 crc kubenswrapper[5047]: I0223 09:02:48.922406 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50aef737-c888-466c-92d0-9c683267d266-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.597261 5047 generic.go:334] "Generic (PLEG): container finished" podID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerID="46026da011a0aa7880379234ae4494704c649f1e05648955f118d375e2a3e1e0" exitCode=0 Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.597571 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" event={"ID":"56f69678-8caa-45a4-8361-a0bf3ef10d19","Type":"ContainerDied","Data":"46026da011a0aa7880379234ae4494704c649f1e05648955f118d375e2a3e1e0"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.608437 5047 generic.go:334] "Generic (PLEG): container finished" podID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerID="e932f691f1d7bc2d70f16a64edd63fd7ca5272ea9a3dce01dda8b2cbb647641f" exitCode=0 Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.608470 5047 generic.go:334] "Generic (PLEG): container finished" podID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerID="a0fef05f6987a47a8d1202e9246540bf050838d7c193ea1ef5e1bb51346912c0" exitCode=0 Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.608505 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerDied","Data":"e932f691f1d7bc2d70f16a64edd63fd7ca5272ea9a3dce01dda8b2cbb647641f"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.608556 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerDied","Data":"a0fef05f6987a47a8d1202e9246540bf050838d7c193ea1ef5e1bb51346912c0"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.610346 5047 generic.go:334] "Generic (PLEG): container finished" podID="5e613490-e007-4f7e-9868-abf59633c7c2" containerID="4c4abeb50762236bdb25af75e59326ffcbc0cafbe34c51b67459b228cce0deb5" exitCode=0 Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.610396 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56c47db55f-krsw7" event={"ID":"5e613490-e007-4f7e-9868-abf59633c7c2","Type":"ContainerDied","Data":"4c4abeb50762236bdb25af75e59326ffcbc0cafbe34c51b67459b228cce0deb5"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.624963 5047 generic.go:334] "Generic (PLEG): container finished" 
podID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerID="c118d283468399c97557ee04375c5d709e9aa2386c25e5bb23ba33476c1b630a" exitCode=0 Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.625026 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"db558a41-6dbf-4b18-af50-6a5311530ef4","Type":"ContainerDied","Data":"c118d283468399c97557ee04375c5d709e9aa2386c25e5bb23ba33476c1b630a"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.625046 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"db558a41-6dbf-4b18-af50-6a5311530ef4","Type":"ContainerDied","Data":"6d9d8ea0af57fce16d9c37676ecf917da449db33ee3b80528f6a720b64a675bc"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.625055 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9d8ea0af57fce16d9c37676ecf917da449db33ee3b80528f6a720b64a675bc" Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.628067 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d8f82aad-7df9-4b14-a328-2cc708aeed84","Type":"ContainerDied","Data":"7c68d16a654220b5726f957209a20f588b0ff5fdaa881dc113e2ec4864b7f892"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.628137 5047 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c68d16a654220b5726f957209a20f588b0ff5fdaa881dc113e2ec4864b7f892" Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.632697 5047 generic.go:334] "Generic (PLEG): container finished" podID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerID="5fe5d75b2b3b7484337c60c1fa98ffc49a3af8f8d35cefe2ba43edd510fddd68" exitCode=0 Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.632750 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerDied","Data":"5fe5d75b2b3b7484337c60c1fa98ffc49a3af8f8d35cefe2ba43edd510fddd68"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.637328 5047 generic.go:334] "Generic (PLEG): container finished" podID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" exitCode=0 Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.637354 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"14988409-8f11-42ca-bc6d-a4ba3d3056a4","Type":"ContainerDied","Data":"b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a"} Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.659333 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:49 crc kubenswrapper[5047]: E0223 09:02:49.659590 5047 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 09:02:49 crc kubenswrapper[5047]: E0223 09:02:49.659640 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:53.659626168 +0000 UTC m=+8295.910953302 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : configmap "openstack-scripts" not found Feb 23 09:02:49 crc kubenswrapper[5047]: I0223 09:02:49.760746 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rh4b\" (UniqueName: \"kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b\") pod \"keystone-ce6c-account-create-update-cm686\" (UID: \"73633a8a-1272-44da-a229-00dd09693a39\") " pod="openstack/keystone-ce6c-account-create-update-cm686" Feb 23 09:02:49 crc kubenswrapper[5047]: E0223 09:02:49.765416 5047 projected.go:194] Error preparing data for projected volume kube-api-access-2rh4b for pod openstack/keystone-ce6c-account-create-update-cm686: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:49 crc kubenswrapper[5047]: E0223 09:02:49.765505 5047 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b podName:73633a8a-1272-44da-a229-00dd09693a39 nodeName:}" failed. No retries permitted until 2026-02-23 09:02:53.765464999 +0000 UTC m=+8296.016792133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2rh4b" (UniqueName: "kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b") pod "keystone-ce6c-account-create-update-cm686" (UID: "73633a8a-1272-44da-a229-00dd09693a39") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.184324 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.193604 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.194454 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.212522 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.214356 5047 scope.go:117] "RemoveContainer" containerID="5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.217273 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-dcb1-account-create-update-s6wxg"] Feb 23 09:02:50 crc kubenswrapper[5047]: E0223 09:02:50.219492 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5\": container with ID starting with 5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5 not found: ID does not exist" containerID="5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.219532 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5"} err="failed to get container status \"5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5\": rpc error: code = NotFound desc = could not find container \"5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5\": container with ID starting with 5c9b91f2cf76cf8687be7fff6e9e192f705038da3a424902d902a1cc7dfc91f5 not found: ID does not exist" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.219558 5047 scope.go:117] "RemoveContainer" containerID="e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0" Feb 23 
09:02:50 crc kubenswrapper[5047]: E0223 09:02:50.224374 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0\": container with ID starting with e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0 not found: ID does not exist" containerID="e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.224413 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0"} err="failed to get container status \"e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0\": rpc error: code = NotFound desc = could not find container \"e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0\": container with ID starting with e1cb954a9885c04c4bbd7e3ec4832eaa364057f49f38e8abbfdcbdccf00749a0 not found: ID does not exist" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.224503 5047 scope.go:117] "RemoveContainer" containerID="2fb193ba990e8911587f36aa7890ab58c29c03d0ee9c8cdec4e9fb37cf8c3f1b" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.244798 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.249250 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-dcb1-account-create-update-s6wxg"] Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.261488 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56c47db55f-krsw7" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.272208 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.276039 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8f82aad-7df9-4b14-a328-2cc708aeed84-erlang-cookie-secret\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.276150 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-tls\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.276196 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-confd\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.276229 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.276255 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db558a41-6dbf-4b18-af50-6a5311530ef4-pod-info\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 
09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.276288 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-erlang-cookie\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277115 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277163 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277189 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277213 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjm2n\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-kube-api-access-fjm2n\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277245 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-server-conf\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277282 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-server-conf\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277310 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-erlang-cookie\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277338 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-plugins\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277370 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-plugins\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277399 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld4sk\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-kube-api-access-ld4sk\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc 
kubenswrapper[5047]: I0223 09:02:50.277419 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-tls\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277443 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-plugins-conf\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277480 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db558a41-6dbf-4b18-af50-6a5311530ef4-erlang-cookie-secret\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277492 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.277563 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8f82aad-7df9-4b14-a328-2cc708aeed84-pod-info\") pod \"d8f82aad-7df9-4b14-a328-2cc708aeed84\" (UID: \"d8f82aad-7df9-4b14-a328-2cc708aeed84\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.278096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.278138 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-plugins-conf\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.278712 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.279434 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.279514 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.281622 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-kube-api-access-fjm2n" (OuterVolumeSpecName: "kube-api-access-fjm2n") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "kube-api-access-fjm2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.281634 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.282025 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.283394 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.285211 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db558a41-6dbf-4b18-af50-6a5311530ef4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.293691 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d8f82aad-7df9-4b14-a328-2cc708aeed84-pod-info" (OuterVolumeSpecName: "pod-info") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.293753 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.296636 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.298224 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f82aad-7df9-4b14-a328-2cc708aeed84-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.298313 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/db558a41-6dbf-4b18-af50-6a5311530ef4-pod-info" (OuterVolumeSpecName: "pod-info") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.307699 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ce6c-account-create-update-cm686"] Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.315193 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.316326 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data" (OuterVolumeSpecName: "config-data") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.318305 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ce6c-account-create-update-cm686"] Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.319683 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4" (OuterVolumeSpecName: "persistence") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "pvc-381a0458-ccaa-459e-8751-bf247399cbd4". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.327147 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.327958 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.355368 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d4528b-cbbd-4c90-97e6-73924844d615" path="/var/lib/kubelet/pods/26d4528b-cbbd-4c90-97e6-73924844d615/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.356645 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d714b4f-02d1-433b-ba95-54c199957dce" path="/var/lib/kubelet/pods/2d714b4f-02d1-433b-ba95-54c199957dce/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.357344 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3464c846-13b9-479e-b9af-3d571f03b284" path="/var/lib/kubelet/pods/3464c846-13b9-479e-b9af-3d571f03b284/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.357872 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" path="/var/lib/kubelet/pods/3d570b43-d0ff-42d7-a305-9d7c2f9f9881/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.359357 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" path="/var/lib/kubelet/pods/4051daa4-7ea3-4ab2-ad1f-52353e0ad995/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.359797 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73633a8a-1272-44da-a229-00dd09693a39" path="/var/lib/kubelet/pods/73633a8a-1272-44da-a229-00dd09693a39/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.360235 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74700fa7-59df-4201-a7c4-de815b82208e" path="/var/lib/kubelet/pods/74700fa7-59df-4201-a7c4-de815b82208e/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.361464 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" 
path="/var/lib/kubelet/pods/9205930b-2303-4b01-a3bf-cf4ef3ad0a49/volumes" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.366427 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-kube-api-access-ld4sk" (OuterVolumeSpecName: "kube-api-access-ld4sk") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "kube-api-access-ld4sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.367598 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.369574 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c" (OuterVolumeSpecName: "persistence") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "pvc-82d2ef49-2524-428f-af58-59241651700c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379514 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data-custom\") pod \"5e613490-e007-4f7e-9868-abf59633c7c2\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379588 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-scripts\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379643 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-internal-tls-certs\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379680 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-config-data\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379696 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-public-tls-certs\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379721 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-sg-core-conf-yaml\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379735 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-combined-ca-bundle\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379762 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd7x2\" (UniqueName: \"kubernetes.io/projected/55697218-de1b-424f-b5ff-2d0806e54a96-kube-api-access-gd7x2\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379778 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6dr2\" (UniqueName: \"kubernetes.io/projected/b24cc8b9-bf0e-45d8-85a9-7c3937896968-kube-api-access-h6dr2\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379796 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-run-httpd\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379816 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-combined-ca-bundle\") pod \"5e613490-e007-4f7e-9868-abf59633c7c2\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " Feb 23 09:02:50 crc 
kubenswrapper[5047]: I0223 09:02:50.379834 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-credential-keys\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379860 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-log-httpd\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379897 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-ceilometer-tls-certs\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379932 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-config-data\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379951 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-combined-ca-bundle\") pod \"56f69678-8caa-45a4-8361-a0bf3ef10d19\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379969 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89hwx\" (UniqueName: 
\"kubernetes.io/projected/56f69678-8caa-45a4-8361-a0bf3ef10d19-kube-api-access-89hwx\") pod \"56f69678-8caa-45a4-8361-a0bf3ef10d19\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.379989 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e613490-e007-4f7e-9868-abf59633c7c2-logs\") pod \"5e613490-e007-4f7e-9868-abf59633c7c2\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380008 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data\") pod \"56f69678-8caa-45a4-8361-a0bf3ef10d19\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380039 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f69678-8caa-45a4-8361-a0bf3ef10d19-logs\") pod \"56f69678-8caa-45a4-8361-a0bf3ef10d19\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380064 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data-custom\") pod \"56f69678-8caa-45a4-8361-a0bf3ef10d19\" (UID: \"56f69678-8caa-45a4-8361-a0bf3ef10d19\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380084 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kjbf\" (UniqueName: \"kubernetes.io/projected/5e613490-e007-4f7e-9868-abf59633c7c2-kube-api-access-8kjbf\") pod \"5e613490-e007-4f7e-9868-abf59633c7c2\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380107 5047 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-scripts\") pod \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\" (UID: \"b24cc8b9-bf0e-45d8-85a9-7c3937896968\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380125 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-fernet-keys\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380140 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-combined-ca-bundle\") pod \"55697218-de1b-424f-b5ff-2d0806e54a96\" (UID: \"55697218-de1b-424f-b5ff-2d0806e54a96\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.380171 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data\") pod \"5e613490-e007-4f7e-9868-abf59633c7c2\" (UID: \"5e613490-e007-4f7e-9868-abf59633c7c2\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.385482 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.387459 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.398346 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55697218-de1b-424f-b5ff-2d0806e54a96-kube-api-access-gd7x2" (OuterVolumeSpecName: "kube-api-access-gd7x2") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "kube-api-access-gd7x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400288 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rh4b\" (UniqueName: \"kubernetes.io/projected/73633a8a-1272-44da-a229-00dd09693a39-kube-api-access-2rh4b\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400318 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400330 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400343 5047 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db558a41-6dbf-4b18-af50-6a5311530ef4-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400371 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") on node \"crc\" " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400387 5047 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-fjm2n\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-kube-api-access-fjm2n\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400400 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400412 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400426 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400437 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld4sk\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-kube-api-access-ld4sk\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400448 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400459 5047 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400470 5047 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/db558a41-6dbf-4b18-af50-6a5311530ef4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400482 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73633a8a-1272-44da-a229-00dd09693a39-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400495 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd7x2\" (UniqueName: \"kubernetes.io/projected/55697218-de1b-424f-b5ff-2d0806e54a96-kube-api-access-gd7x2\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400508 5047 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400519 5047 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8f82aad-7df9-4b14-a328-2cc708aeed84-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400543 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") on node \"crc\" " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400558 5047 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.400570 5047 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d8f82aad-7df9-4b14-a328-2cc708aeed84-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.419323 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e613490-e007-4f7e-9868-abf59633c7c2-logs" (OuterVolumeSpecName: "logs") pod "5e613490-e007-4f7e-9868-abf59633c7c2" (UID: "5e613490-e007-4f7e-9868-abf59633c7c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.419666 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data" (OuterVolumeSpecName: "config-data") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.427088 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.427503 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-scripts" (OuterVolumeSpecName: "scripts") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.434085 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f69678-8caa-45a4-8361-a0bf3ef10d19-logs" (OuterVolumeSpecName: "logs") pod "56f69678-8caa-45a4-8361-a0bf3ef10d19" (UID: "56f69678-8caa-45a4-8361-a0bf3ef10d19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.437376 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.441840 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24cc8b9-bf0e-45d8-85a9-7c3937896968-kube-api-access-h6dr2" (OuterVolumeSpecName: "kube-api-access-h6dr2") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "kube-api-access-h6dr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.508694 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-combined-ca-bundle\") pod \"da9c6500-238e-415a-9e31-e7bf9ccdd205\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.508746 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-config-data\") pod \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.508786 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/14988409-8f11-42ca-bc6d-a4ba3d3056a4-kube-api-access-qfskw\") pod \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.508977 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-scripts\") pod \"da9c6500-238e-415a-9e31-e7bf9ccdd205\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509014 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-config-data\") pod \"da9c6500-238e-415a-9e31-e7bf9ccdd205\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509060 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-internal-tls-certs\") pod \"da9c6500-238e-415a-9e31-e7bf9ccdd205\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509086 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2jxh\" (UniqueName: \"kubernetes.io/projected/da9c6500-238e-415a-9e31-e7bf9ccdd205-kube-api-access-l2jxh\") pod \"da9c6500-238e-415a-9e31-e7bf9ccdd205\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509186 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-combined-ca-bundle\") pod \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\" (UID: \"14988409-8f11-42ca-bc6d-a4ba3d3056a4\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509322 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-public-tls-certs\") pod \"da9c6500-238e-415a-9e31-e7bf9ccdd205\" (UID: \"da9c6500-238e-415a-9e31-e7bf9ccdd205\") " Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509861 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509873 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509884 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6dr2\" (UniqueName: 
\"kubernetes.io/projected/b24cc8b9-bf0e-45d8-85a9-7c3937896968-kube-api-access-h6dr2\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509897 5047 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509959 5047 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b24cc8b9-bf0e-45d8-85a9-7c3937896968-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509970 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e613490-e007-4f7e-9868-abf59633c7c2-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.509978 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f69678-8caa-45a4-8361-a0bf3ef10d19-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.546662 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5e613490-e007-4f7e-9868-abf59633c7c2" (UID: "5e613490-e007-4f7e-9868-abf59633c7c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.548409 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.548493 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "56f69678-8caa-45a4-8361-a0bf3ef10d19" (UID: "56f69678-8caa-45a4-8361-a0bf3ef10d19"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.548797 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f69678-8caa-45a4-8361-a0bf3ef10d19-kube-api-access-89hwx" (OuterVolumeSpecName: "kube-api-access-89hwx") pod "56f69678-8caa-45a4-8361-a0bf3ef10d19" (UID: "56f69678-8caa-45a4-8361-a0bf3ef10d19"). InnerVolumeSpecName "kube-api-access-89hwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.552659 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14988409-8f11-42ca-bc6d-a4ba3d3056a4-kube-api-access-qfskw" (OuterVolumeSpecName: "kube-api-access-qfskw") pod "14988409-8f11-42ca-bc6d-a4ba3d3056a4" (UID: "14988409-8f11-42ca-bc6d-a4ba3d3056a4"). InnerVolumeSpecName "kube-api-access-qfskw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.565081 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e613490-e007-4f7e-9868-abf59633c7c2-kube-api-access-8kjbf" (OuterVolumeSpecName: "kube-api-access-8kjbf") pod "5e613490-e007-4f7e-9868-abf59633c7c2" (UID: "5e613490-e007-4f7e-9868-abf59633c7c2"). InnerVolumeSpecName "kube-api-access-8kjbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.566552 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9c6500-238e-415a-9e31-e7bf9ccdd205-kube-api-access-l2jxh" (OuterVolumeSpecName: "kube-api-access-l2jxh") pod "da9c6500-238e-415a-9e31-e7bf9ccdd205" (UID: "da9c6500-238e-415a-9e31-e7bf9ccdd205"). InnerVolumeSpecName "kube-api-access-l2jxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.567816 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.568189 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-381a0458-ccaa-459e-8751-bf247399cbd4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4") on node "crc" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.568598 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.568730 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-82d2ef49-2524-428f-af58-59241651700c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c") on node "crc" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.591138 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-scripts" (OuterVolumeSpecName: "scripts") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.602347 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-scripts" (OuterVolumeSpecName: "scripts") pod "da9c6500-238e-415a-9e31-e7bf9ccdd205" (UID: "da9c6500-238e-415a-9e31-e7bf9ccdd205"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631036 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-381a0458-ccaa-459e-8751-bf247399cbd4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-381a0458-ccaa-459e-8751-bf247399cbd4\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631254 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631266 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89hwx\" (UniqueName: \"kubernetes.io/projected/56f69678-8caa-45a4-8361-a0bf3ef10d19-kube-api-access-89hwx\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631275 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2jxh\" (UniqueName: \"kubernetes.io/projected/da9c6500-238e-415a-9e31-e7bf9ccdd205-kube-api-access-l2jxh\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631285 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631293 5047 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8kjbf\" (UniqueName: \"kubernetes.io/projected/5e613490-e007-4f7e-9868-abf59633c7c2-kube-api-access-8kjbf\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631301 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631310 5047 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631318 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631338 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-82d2ef49-2524-428f-af58-59241651700c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-82d2ef49-2524-428f-af58-59241651700c\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.631346 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfskw\" (UniqueName: \"kubernetes.io/projected/14988409-8f11-42ca-bc6d-a4ba3d3056a4-kube-api-access-qfskw\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.638375 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56f69678-8caa-45a4-8361-a0bf3ef10d19" (UID: "56f69678-8caa-45a4-8361-a0bf3ef10d19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.654925 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c59427cb-019c-4f83-af18-75900909e70f/ovn-northd/0.log" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.654994 5047 generic.go:334] "Generic (PLEG): container finished" podID="c59427cb-019c-4f83-af18-75900909e70f" containerID="5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b" exitCode=139 Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.666597 5047 generic.go:334] "Generic (PLEG): container finished" podID="55697218-de1b-424f-b5ff-2d0806e54a96" containerID="a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285" exitCode=0 Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.666703 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-75f6dd64bf-sswjk" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.678874 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.704964 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.716890 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.720971 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.745931 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.746372 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56c47db55f-krsw7" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.790974 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.796444 5047 generic.go:334] "Generic (PLEG): container finished" podID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerID="4373b06bebc37262c61b269b46fec7e7e686ae9f047fd098351acbd0de2a4b21" exitCode=0 Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.799628 5047 generic.go:334] "Generic (PLEG): container finished" podID="01713a47-8bed-4038-8339-bdcd77e6e1db" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" exitCode=0 Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.808583 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.815376 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.824122 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.824143 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e613490-e007-4f7e-9868-abf59633c7c2" (UID: "5e613490-e007-4f7e-9868-abf59633c7c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.839017 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-config-data" (OuterVolumeSpecName: "config-data") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.848281 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.848302 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.848313 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.848321 5047 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.868615 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14988409-8f11-42ca-bc6d-a4ba3d3056a4" (UID: "14988409-8f11-42ca-bc6d-a4ba3d3056a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.921696 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-server-conf" (OuterVolumeSpecName: "server-conf") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.930044 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.941077 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-config-data" (OuterVolumeSpecName: "config-data") pod "14988409-8f11-42ca-bc6d-a4ba3d3056a4" (UID: "14988409-8f11-42ca-bc6d-a4ba3d3056a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.953991 5047 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.954031 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.954044 5047 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db558a41-6dbf-4b18-af50-6a5311530ef4-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.954053 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14988409-8f11-42ca-bc6d-a4ba3d3056a4-config-data\") on node \"crc\" 
DevicePath \"\"" Feb 23 09:02:50 crc kubenswrapper[5047]: I0223 09:02:50.975077 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.052264 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-server-conf" (OuterVolumeSpecName: "server-conf") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.056047 5047 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8f82aad-7df9-4b14-a328-2cc708aeed84-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.056080 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.069790 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data" (OuterVolumeSpecName: "config-data") pod "5e613490-e007-4f7e-9868-abf59633c7c2" (UID: "5e613490-e007-4f7e-9868-abf59633c7c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.088040 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.114982 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d854f95bd-qf2l8"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115033 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d854f95bd-qf2l8"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115050 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c59427cb-019c-4f83-af18-75900909e70f","Type":"ContainerDied","Data":"5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115080 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75f6dd64bf-sswjk" event={"ID":"55697218-de1b-424f-b5ff-2d0806e54a96","Type":"ContainerDied","Data":"a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115102 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75f6dd64bf-sswjk" event={"ID":"55697218-de1b-424f-b5ff-2d0806e54a96","Type":"ContainerDied","Data":"6e6604c64fd88ae2771dbb3358825a5cb8bb77bda481a6b3261006aa838d5630"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115111 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"da9c6500-238e-415a-9e31-e7bf9ccdd205","Type":"ContainerDied","Data":"23fe375fd297f1a2f28ba51c2e0cd8e6145ccb0f45696d8d95c745b5f117ae0e"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115124 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58486f655d-lth95"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115137 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-58486f655d-lth95"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115147 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b24cc8b9-bf0e-45d8-85a9-7c3937896968","Type":"ContainerDied","Data":"36109629f4a0c8a04a72f128f8db98fbfb27085eeb7b6927cb9c00597f00dea5"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115162 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"14988409-8f11-42ca-bc6d-a4ba3d3056a4","Type":"ContainerDied","Data":"9501a06a1f385419b71931a2d8cb4eb06a1be0c307accee6c17be6c2cc013e2e"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115177 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-n2bdk"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115187 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f75d6c566-lwrqj" event={"ID":"56f69678-8caa-45a4-8361-a0bf3ef10d19","Type":"ContainerDied","Data":"decc8b8f7b2dfc4e7ef4724327db55ae0b38ada9af823a26288f00524f859bf5"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115202 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56c47db55f-krsw7" event={"ID":"5e613490-e007-4f7e-9868-abf59633c7c2","Type":"ContainerDied","Data":"6d94b313ca3f23a72550cd486abde3b7c8eb06b9b4fb20291a2cf0afb5b37850"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115214 5047 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell0-aad1-account-create-update-n2bdk"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115225 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115237 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115249 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0953-account-create-update-psp84"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115257 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"753e53f0-d44b-4af2-9aff-eabc1a46d537","Type":"ContainerDied","Data":"4373b06bebc37262c61b269b46fec7e7e686ae9f047fd098351acbd0de2a4b21"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115269 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0953-account-create-update-psp84"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115281 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"01713a47-8bed-4038-8339-bdcd77e6e1db","Type":"ContainerDied","Data":"9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115293 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115304 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115313 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6ff9694944-nmq4c"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115323 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6ff9694944-nmq4c"] Feb 23 
09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115332 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f58d8574d-t4gn8"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115342 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5f58d8574d-t4gn8"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115351 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115362 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115371 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rl8wd"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115383 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rl8wd"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115391 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115400 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115412 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-r57lp"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115422 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b6cf-account-create-update-r57lp"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115437 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6c0a-account-create-update-mngkq"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.115446 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6c0a-account-create-update-mngkq"] Feb 23 09:02:51 
crc kubenswrapper[5047]: I0223 09:02:51.115478 5047 scope.go:117] "RemoveContainer" containerID="a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.143023 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.151324 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "da9c6500-238e-415a-9e31-e7bf9ccdd205" (UID: "da9c6500-238e-415a-9e31-e7bf9ccdd205"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.155968 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.158484 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.158504 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e613490-e007-4f7e-9868-abf59633c7c2-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.158515 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.170357 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-internal-tls-certs" (OuterVolumeSpecName: 
"internal-tls-certs") pod "da9c6500-238e-415a-9e31-e7bf9ccdd205" (UID: "da9c6500-238e-415a-9e31-e7bf9ccdd205"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.172163 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.181386 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.185443 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c59427cb-019c-4f83-af18-75900909e70f/ovn-northd/0.log" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.185507 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.190284 5047 scope.go:117] "RemoveContainer" containerID="a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285" Feb 23 09:02:51 crc kubenswrapper[5047]: E0223 09:02:51.190796 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285\": container with ID starting with a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285 not found: ID does not exist" containerID="a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.190829 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285"} err="failed to get container status \"a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285\": rpc error: code = NotFound desc = could not find container 
\"a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285\": container with ID starting with a2aa3c71cf56d92d78a6319101098dd98b5f40cdc1f1960e2e0f189d122e8285 not found: ID does not exist" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.190849 5047 scope.go:117] "RemoveContainer" containerID="e932f691f1d7bc2d70f16a64edd63fd7ca5272ea9a3dce01dda8b2cbb647641f" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.219394 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data" (OuterVolumeSpecName: "config-data") pod "56f69678-8caa-45a4-8361-a0bf3ef10d19" (UID: "56f69678-8caa-45a4-8361-a0bf3ef10d19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.230200 5047 scope.go:117] "RemoveContainer" containerID="a0fef05f6987a47a8d1202e9246540bf050838d7c193ea1ef5e1bb51346912c0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.236035 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-config-data" (OuterVolumeSpecName: "config-data") pod "b24cc8b9-bf0e-45d8-85a9-7c3937896968" (UID: "b24cc8b9-bf0e-45d8-85a9-7c3937896968"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.251154 5047 scope.go:117] "RemoveContainer" containerID="7a05e953fc80368078f66f82f035d35c3158450022c92ac887c442e42eb47b06" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260097 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260178 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd\") pod \"db558a41-6dbf-4b18-af50-6a5311530ef4\" (UID: \"db558a41-6dbf-4b18-af50-6a5311530ef4\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260223 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-kolla-config\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260242 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-generated\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260261 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c59427cb-019c-4f83-af18-75900909e70f-ovn-rundir\") pod \"c59427cb-019c-4f83-af18-75900909e70f\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260279 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-ovn-northd-tls-certs\") pod \"c59427cb-019c-4f83-af18-75900909e70f\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260300 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-operator-scripts\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260318 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6ngk\" (UniqueName: \"kubernetes.io/projected/753e53f0-d44b-4af2-9aff-eabc1a46d537-kube-api-access-w6ngk\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260335 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-galera-tls-certs\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.260356 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-config-data\") pod \"01713a47-8bed-4038-8339-bdcd77e6e1db\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.261665 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: W0223 09:02:51.261742 5047 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/db558a41-6dbf-4b18-af50-6a5311530ef4/volumes/kubernetes.io~projected/rabbitmq-confd Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.261774 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "db558a41-6dbf-4b18-af50-6a5311530ef4" (UID: "db558a41-6dbf-4b18-af50-6a5311530ef4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.261005 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c59427cb-019c-4f83-af18-75900909e70f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c59427cb-019c-4f83-af18-75900909e70f" (UID: "c59427cb-019c-4f83-af18-75900909e70f"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262262 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262308 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-combined-ca-bundle\") pod \"01713a47-8bed-4038-8339-bdcd77e6e1db\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262351 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-metrics-certs-tls-certs\") pod \"c59427cb-019c-4f83-af18-75900909e70f\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262397 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-config\") pod \"c59427cb-019c-4f83-af18-75900909e70f\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262453 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-scripts\") pod \"c59427cb-019c-4f83-af18-75900909e70f\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262469 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-combined-ca-bundle\") pod \"c59427cb-019c-4f83-af18-75900909e70f\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262483 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/c59427cb-019c-4f83-af18-75900909e70f-kube-api-access-j5k5d\") pod \"c59427cb-019c-4f83-af18-75900909e70f\" (UID: \"c59427cb-019c-4f83-af18-75900909e70f\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262533 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-default\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262566 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-combined-ca-bundle\") pod \"753e53f0-d44b-4af2-9aff-eabc1a46d537\" (UID: \"753e53f0-d44b-4af2-9aff-eabc1a46d537\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.262585 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8zpb\" (UniqueName: \"kubernetes.io/projected/01713a47-8bed-4038-8339-bdcd77e6e1db-kube-api-access-c8zpb\") pod \"01713a47-8bed-4038-8339-bdcd77e6e1db\" (UID: \"01713a47-8bed-4038-8339-bdcd77e6e1db\") " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263022 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263039 5047 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f69678-8caa-45a4-8361-a0bf3ef10d19-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263049 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db558a41-6dbf-4b18-af50-6a5311530ef4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263057 5047 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263065 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c59427cb-019c-4f83-af18-75900909e70f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263250 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24cc8b9-bf0e-45d8-85a9-7c3937896968-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263413 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.263553 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.264170 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.265397 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-scripts" (OuterVolumeSpecName: "scripts") pod "c59427cb-019c-4f83-af18-75900909e70f" (UID: "c59427cb-019c-4f83-af18-75900909e70f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.265553 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-config" (OuterVolumeSpecName: "config") pod "c59427cb-019c-4f83-af18-75900909e70f" (UID: "c59427cb-019c-4f83-af18-75900909e70f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.272111 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d8f82aad-7df9-4b14-a328-2cc708aeed84" (UID: "d8f82aad-7df9-4b14-a328-2cc708aeed84"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.280766 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753e53f0-d44b-4af2-9aff-eabc1a46d537-kube-api-access-w6ngk" (OuterVolumeSpecName: "kube-api-access-w6ngk") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "kube-api-access-w6ngk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.281016 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59427cb-019c-4f83-af18-75900909e70f-kube-api-access-j5k5d" (OuterVolumeSpecName: "kube-api-access-j5k5d") pod "c59427cb-019c-4f83-af18-75900909e70f" (UID: "c59427cb-019c-4f83-af18-75900909e70f"). InnerVolumeSpecName "kube-api-access-j5k5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.286074 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "55697218-de1b-424f-b5ff-2d0806e54a96" (UID: "55697218-de1b-424f-b5ff-2d0806e54a96"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.286081 5047 scope.go:117] "RemoveContainer" containerID="e5e7ee2c8b135051740b1ea7d291484779a9c2e5a708219176467993426ff5fd" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.286172 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01713a47-8bed-4038-8339-bdcd77e6e1db-kube-api-access-c8zpb" (OuterVolumeSpecName: "kube-api-access-c8zpb") pod "01713a47-8bed-4038-8339-bdcd77e6e1db" (UID: "01713a47-8bed-4038-8339-bdcd77e6e1db"). InnerVolumeSpecName "kube-api-access-c8zpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.306070 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602" (OuterVolumeSpecName: "mysql-db") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.325234 5047 scope.go:117] "RemoveContainer" containerID="a0215ac3fbe253cdcd33e23224e66c3483f6042b430140b93acbd73cad3aa678" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.330298 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-config-data" (OuterVolumeSpecName: "config-data") pod "01713a47-8bed-4038-8339-bdcd77e6e1db" (UID: "01713a47-8bed-4038-8339-bdcd77e6e1db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.342166 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.349174 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01713a47-8bed-4038-8339-bdcd77e6e1db" (UID: "01713a47-8bed-4038-8339-bdcd77e6e1db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.361513 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c59427cb-019c-4f83-af18-75900909e70f" (UID: "c59427cb-019c-4f83-af18-75900909e70f"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.363990 5047 scope.go:117] "RemoveContainer" containerID="60141a0931e5bb65b57978e1903e918449f3377efc61f92213ef2a271cd7c68c" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364235 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364258 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55697218-de1b-424f-b5ff-2d0806e54a96-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364273 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364281 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364299 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/c59427cb-019c-4f83-af18-75900909e70f-kube-api-access-j5k5d\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364309 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c59427cb-019c-4f83-af18-75900909e70f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364317 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364326 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364334 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8zpb\" (UniqueName: \"kubernetes.io/projected/01713a47-8bed-4038-8339-bdcd77e6e1db-kube-api-access-c8zpb\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364342 5047 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8f82aad-7df9-4b14-a328-2cc708aeed84-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364350 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/753e53f0-d44b-4af2-9aff-eabc1a46d537-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364359 5047 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/753e53f0-d44b-4af2-9aff-eabc1a46d537-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364366 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6ngk\" (UniqueName: \"kubernetes.io/projected/753e53f0-d44b-4af2-9aff-eabc1a46d537-kube-api-access-w6ngk\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364374 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/01713a47-8bed-4038-8339-bdcd77e6e1db-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.364402 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\") on node \"crc\" " Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.368298 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da9c6500-238e-415a-9e31-e7bf9ccdd205" (UID: "da9c6500-238e-415a-9e31-e7bf9ccdd205"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.377371 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c59427cb-019c-4f83-af18-75900909e70f" (UID: "c59427cb-019c-4f83-af18-75900909e70f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.381792 5047 scope.go:117] "RemoveContainer" containerID="5fe5d75b2b3b7484337c60c1fa98ffc49a3af8f8d35cefe2ba43edd510fddd68" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.382517 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "753e53f0-d44b-4af2-9aff-eabc1a46d537" (UID: "753e53f0-d44b-4af2-9aff-eabc1a46d537"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.392888 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5f75d6c566-lwrqj"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.396151 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.396334 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602") on node "crc" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.408067 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-config-data" (OuterVolumeSpecName: "config-data") pod "da9c6500-238e-415a-9e31-e7bf9ccdd205" (UID: "da9c6500-238e-415a-9e31-e7bf9ccdd205"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.413810 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5f75d6c566-lwrqj"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.419800 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "c59427cb-019c-4f83-af18-75900909e70f" (UID: "c59427cb-019c-4f83-af18-75900909e70f"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.425590 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-56c47db55f-krsw7"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.438556 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-56c47db55f-krsw7"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.463294 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.467318 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.467362 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.467378 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59427cb-019c-4f83-af18-75900909e70f-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.467395 5047 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/753e53f0-d44b-4af2-9aff-eabc1a46d537-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.467410 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5cfa6ec-ec29-471b-a921-f68616ffa602\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.467434 5047 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da9c6500-238e-415a-9e31-e7bf9ccdd205-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.480025 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.521212 5047 scope.go:117] "RemoveContainer" containerID="a65243912ed9e00054f4ebb306fb5fe73793f4ea7a467cd196686a7efbdd8df4" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.527784 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.533722 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.567966 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.571510 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.580472 5047 scope.go:117] "RemoveContainer" containerID="b55343fde4610941e2de9a407053a8025bb06c2d056689bd185172f9ed4bdd5a" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.603449 5047 scope.go:117] "RemoveContainer" containerID="46026da011a0aa7880379234ae4494704c649f1e05648955f118d375e2a3e1e0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.608950 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-75f6dd64bf-sswjk"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.613654 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-75f6dd64bf-sswjk"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.648029 5047 scope.go:117] "RemoveContainer" 
containerID="f80a58364511a0fa76558ab45536d04c4cce2e97d79499c3e5dad8ce40b0217a" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.648880 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.655005 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.687274 5047 scope.go:117] "RemoveContainer" containerID="4c4abeb50762236bdb25af75e59326ffcbc0cafbe34c51b67459b228cce0deb5" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.712183 5047 scope.go:117] "RemoveContainer" containerID="2b967aa0fbb7606bb8ade55e0690f140e4568286a58a2b7dd614a5b813dee08a" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.824985 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"753e53f0-d44b-4af2-9aff-eabc1a46d537","Type":"ContainerDied","Data":"3ecaac4e9c6f1e95cfd65140c33e4209babfd73d1e7084f88cb7a0312107bc12"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.825029 5047 scope.go:117] "RemoveContainer" containerID="4373b06bebc37262c61b269b46fec7e7e686ae9f047fd098351acbd0de2a4b21" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.825184 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.835504 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c59427cb-019c-4f83-af18-75900909e70f/ovn-northd/0.log" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.835646 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c59427cb-019c-4f83-af18-75900909e70f","Type":"ContainerDied","Data":"9a19259a9f8f0b14e6169dbcfedda2d068f9b140d9214894ccfa3ec424cc30f3"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.835715 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.856477 5047 scope.go:117] "RemoveContainer" containerID="abc7a14060746168d3aab5953a1dec8484ca0c6b35fab048b6695351e5711f38" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.873598 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.874887 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"01713a47-8bed-4038-8339-bdcd77e6e1db","Type":"ContainerDied","Data":"36d003992f1ff6be3ceeaeb907294e65d0b8ade351ef14317b3419b70e24b148"} Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.875127 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.878316 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.891384 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.900218 5047 scope.go:117] "RemoveContainer" containerID="300cc5bda7124e79cc3bdc763a253160aad67d2c3e2c6d44cbdf59edc78788c3" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.903948 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.924214 5047 scope.go:117] "RemoveContainer" containerID="5474b9871e571cc4e16fdda4dacca659354e26baaaa9e5970121b7d51f36100b" Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.925951 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.932071 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:02:51 crc kubenswrapper[5047]: I0223 09:02:51.964311 5047 scope.go:117] "RemoveContainer" containerID="9f09d38933227a4847581a0cdb85bb38728c31775104288e65ecc52ea3a4a431" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.353462 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01713a47-8bed-4038-8339-bdcd77e6e1db" path="/var/lib/kubelet/pods/01713a47-8bed-4038-8339-bdcd77e6e1db/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.354075 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07094621-ca88-4942-a226-76667658a5bc" path="/var/lib/kubelet/pods/07094621-ca88-4942-a226-76667658a5bc/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.354634 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" path="/var/lib/kubelet/pods/1058e169-d572-43c5-80b3-d5f2f2c78afb/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.355615 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" path="/var/lib/kubelet/pods/14988409-8f11-42ca-bc6d-a4ba3d3056a4/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.356122 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39cf7673-4d38-49e0-9b86-f80c3949fd06" path="/var/lib/kubelet/pods/39cf7673-4d38-49e0-9b86-f80c3949fd06/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.356642 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad1ab7e-d139-471c-9d7c-ceaea0bc0271" path="/var/lib/kubelet/pods/4ad1ab7e-d139-471c-9d7c-ceaea0bc0271/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.357071 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" path="/var/lib/kubelet/pods/4ce03d60-441e-4a79-acce-d54444aedfcb/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.358235 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50aef737-c888-466c-92d0-9c683267d266" path="/var/lib/kubelet/pods/50aef737-c888-466c-92d0-9c683267d266/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.358693 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55697218-de1b-424f-b5ff-2d0806e54a96" path="/var/lib/kubelet/pods/55697218-de1b-424f-b5ff-2d0806e54a96/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.359199 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" path="/var/lib/kubelet/pods/56f69678-8caa-45a4-8361-a0bf3ef10d19/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.360238 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" path="/var/lib/kubelet/pods/570bec3d-603c-4f92-b183-c5abb7e799d8/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.360800 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" path="/var/lib/kubelet/pods/5e613490-e007-4f7e-9868-abf59633c7c2/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.361845 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753e53f0-d44b-4af2-9aff-eabc1a46d537" path="/var/lib/kubelet/pods/753e53f0-d44b-4af2-9aff-eabc1a46d537/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.362393 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7607a87e-f072-4cdf-ba60-40b579d694ab" path="/var/lib/kubelet/pods/7607a87e-f072-4cdf-ba60-40b579d694ab/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.362765 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8596ec60-89ed-43d9-be63-b7130fd0f937" path="/var/lib/kubelet/pods/8596ec60-89ed-43d9-be63-b7130fd0f937/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.363680 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95116420-b62b-402c-bcbe-17026cba0354" path="/var/lib/kubelet/pods/95116420-b62b-402c-bcbe-17026cba0354/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.364337 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" path="/var/lib/kubelet/pods/b24cc8b9-bf0e-45d8-85a9-7c3937896968/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.364966 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" path="/var/lib/kubelet/pods/b4565dbe-dc04-452f-a79e-bc09cb299f29/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.365860 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c59427cb-019c-4f83-af18-75900909e70f" path="/var/lib/kubelet/pods/c59427cb-019c-4f83-af18-75900909e70f/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.366515 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" path="/var/lib/kubelet/pods/d8f82aad-7df9-4b14-a328-2cc708aeed84/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.367545 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" path="/var/lib/kubelet/pods/da9c6500-238e-415a-9e31-e7bf9ccdd205/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.368408 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" path="/var/lib/kubelet/pods/db558a41-6dbf-4b18-af50-6a5311530ef4/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.369327 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7a46ed-9033-4c72-9f92-34816276560b" path="/var/lib/kubelet/pods/dd7a46ed-9033-4c72-9f92-34816276560b/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.369652 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e444ce7d-b56a-406c-91ff-4623a469c13a" path="/var/lib/kubelet/pods/e444ce7d-b56a-406c-91ff-4623a469c13a/volumes" Feb 23 09:02:52 crc kubenswrapper[5047]: I0223 09:02:52.414874 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86dd65c656-6n5cp" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.118:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8443: connect: connection refused" Feb 23 09:02:52 crc kubenswrapper[5047]: E0223 09:02:52.519796 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:02:52 crc kubenswrapper[5047]: E0223 09:02:52.521738 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:02:52 crc kubenswrapper[5047]: E0223 09:02:52.524673 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:02:52 crc kubenswrapper[5047]: E0223 09:02:52.524737 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:02:53 crc kubenswrapper[5047]: I0223 09:02:53.582819 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 09:02:53 crc kubenswrapper[5047]: I0223 09:02:53.583277 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="f7982ba2-03dd-461c-bef5-d4223fd9ddd5" containerName="adoption" containerID="cri-o://275245be4e1d753203f6c3c8cc1e21aa7ee24902f5fb7405f17a6544e6c06d89" gracePeriod=30 Feb 23 09:02:53 crc kubenswrapper[5047]: I0223 09:02:53.809587 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 09:02:53 crc 
kubenswrapper[5047]: I0223 09:02:53.809925 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="33e156a6-7aa5-4769-8219-118aecb3b161" containerName="adoption" containerID="cri-o://1c77c1f0f1e6b25a138a342acba27c33a4de799901c4d45d6ddcd5cd88c94c61" gracePeriod=30 Feb 23 09:02:56 crc kubenswrapper[5047]: I0223 09:02:56.342062 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:02:56 crc kubenswrapper[5047]: E0223 09:02:56.342717 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:03:00 crc kubenswrapper[5047]: I0223 09:03:00.920854 5047 scope.go:117] "RemoveContainer" containerID="dcf3076fcf8e8a3596430ddf627b1510267721dfc147bcaa96b9951e0ece0e7c" Feb 23 09:03:00 crc kubenswrapper[5047]: I0223 09:03:00.944825 5047 scope.go:117] "RemoveContainer" containerID="d75d9c5c9f819a4fef6f8458366281a7a7b56194dad2149a7dac9c6288a64b3e" Feb 23 09:03:00 crc kubenswrapper[5047]: I0223 09:03:00.977887 5047 scope.go:117] "RemoveContainer" containerID="c349d4019c79a38336f264dd317d2df1febd4f029273706fb5fc837b0d162f2a" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.002015 5047 scope.go:117] "RemoveContainer" containerID="5b5cdce23bc6b1850489f770e11653c0e0f5c790fb261d4311d63e59f1d91444" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.020951 5047 scope.go:117] "RemoveContainer" containerID="809716648c4107c38d6ed5e1da34f1681092785995acc0e50a5dfbd5e3502aa8" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.042494 5047 scope.go:117] "RemoveContainer" 
containerID="8c44a4d060fe0afbdab6a1f05ac20d1249c428efb3b01ca24ed8e720e9d10271" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.063267 5047 scope.go:117] "RemoveContainer" containerID="5b1e9abcc11ca95b25f760ee4921480a4a3de7b01708fdf6b170d338c0245738" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.078856 5047 scope.go:117] "RemoveContainer" containerID="c118d283468399c97557ee04375c5d709e9aa2386c25e5bb23ba33476c1b630a" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.114932 5047 scope.go:117] "RemoveContainer" containerID="7eaea98d0755487d3bd7e40d9df640eabeb421e14f34fb1d51127b4360c754c2" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.155430 5047 scope.go:117] "RemoveContainer" containerID="3af56d9850e342eaf92994ecc8a16c1e2545b5684d7c31b87e2addd0e9540af4" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.174183 5047 scope.go:117] "RemoveContainer" containerID="1b91d13309e4eceb1e8a3d37f030db4d6384384b7d04e3f74e24403de0da2f57" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.190809 5047 scope.go:117] "RemoveContainer" containerID="38bbd56ee9a5e5626e98c5f9e39c04af222fb0e3d5740284ea5093fdcc638927" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.213676 5047 scope.go:117] "RemoveContainer" containerID="20ea81041bb52ceb23d8bb279a41e64793f561c21214179c617e0cd48b7db8c3" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.247362 5047 scope.go:117] "RemoveContainer" containerID="919ae0467b0ab826c5b9f2356df51aec4127e389561d05792503362c2de42572" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.272686 5047 scope.go:117] "RemoveContainer" containerID="b6d42e4e1c915ab7ad50cbcf97e861f1ad954fa7b65115fca83232f8d0cd78e5" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.353323 5047 scope.go:117] "RemoveContainer" containerID="4ada85ea2224cba0dd0ac5bfad26078501595310d0f4318c0f68ec1cab3e8d84" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.413755 5047 scope.go:117] "RemoveContainer" 
containerID="d60474bd97e9b0d0915a58eae3db7da57f5234e0b02651a966146b0399694399" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.435572 5047 scope.go:117] "RemoveContainer" containerID="d228a3b0cadcbb3c297b956ba0f819355627a6ecf7412f2e7ba427ce670fe91f" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.461673 5047 scope.go:117] "RemoveContainer" containerID="9f6508e2706dcc1d9229284a5dbaed31e788597ffade321a44957814cb190cdb" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.502124 5047 scope.go:117] "RemoveContainer" containerID="d5925a76b8dc1aa4699c52a60dfe9bafd55bce85e0b0fe07e7a09153ac94b57d" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.541871 5047 scope.go:117] "RemoveContainer" containerID="7ec9f905acecad1325d415f2ddb8f8369e83f15dfec01b649e5e7392716e0b34" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.561400 5047 scope.go:117] "RemoveContainer" containerID="33c766011377115c89385b76818662881b608592e8f5fe9eacc9c3fff7aad9a0" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.581591 5047 scope.go:117] "RemoveContainer" containerID="764eca4b1eae80106e477be230c3c004a88237b926e233ec66dd335cc48d62a3" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.604195 5047 scope.go:117] "RemoveContainer" containerID="000d3c24fd99b6c2ed581b5f5c4fd5b32ee63c9bba247e2ec7f68a77055306d1" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.629440 5047 scope.go:117] "RemoveContainer" containerID="71886f23f03852059f3e2fb43eb1d7d43f524af30918a11409712a16fb14f4a7" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.676660 5047 scope.go:117] "RemoveContainer" containerID="ec75f8350065896340138084424783a8ac231166ea7fffbf795299442b505721" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.701966 5047 scope.go:117] "RemoveContainer" containerID="f1af0f63c5a3f5e9fc8ecb2896b6500cf5a64a631e0df682cd475c9a6a8beb23" Feb 23 09:03:01 crc kubenswrapper[5047]: I0223 09:03:01.734765 5047 scope.go:117] "RemoveContainer" 
containerID="86a16a531bca49c39e0367bf3d4345deb39c5463a8f65beb54493af30860a9d3" Feb 23 09:03:02 crc kubenswrapper[5047]: I0223 09:03:02.414183 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86dd65c656-6n5cp" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.118:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8443: connect: connection refused" Feb 23 09:03:02 crc kubenswrapper[5047]: I0223 09:03:02.414317 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 09:03:02 crc kubenswrapper[5047]: E0223 09:03:02.519556 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:02 crc kubenswrapper[5047]: E0223 09:03:02.521166 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:02 crc kubenswrapper[5047]: E0223 09:03:02.522721 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:02 crc kubenswrapper[5047]: E0223 09:03:02.522947 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.012440 5047 generic.go:334] "Generic (PLEG): container finished" podID="f1306003-e12b-4db1-beeb-cd461db0975e" containerID="90d8272e14124df09a1e4603e8bf04a61add7c1cf9078eea0c18f84d9f987fe3" exitCode=0 Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.012574 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755d9b4d6f-jbxjv" event={"ID":"f1306003-e12b-4db1-beeb-cd461db0975e","Type":"ContainerDied","Data":"90d8272e14124df09a1e4603e8bf04a61add7c1cf9078eea0c18f84d9f987fe3"} Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.419079 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.585677 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-public-tls-certs\") pod \"f1306003-e12b-4db1-beeb-cd461db0975e\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.585836 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsqlz\" (UniqueName: \"kubernetes.io/projected/f1306003-e12b-4db1-beeb-cd461db0975e-kube-api-access-zsqlz\") pod \"f1306003-e12b-4db1-beeb-cd461db0975e\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.585955 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-internal-tls-certs\") pod \"f1306003-e12b-4db1-beeb-cd461db0975e\" (UID: 
\"f1306003-e12b-4db1-beeb-cd461db0975e\") " Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.586025 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-config\") pod \"f1306003-e12b-4db1-beeb-cd461db0975e\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.586081 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-ovndb-tls-certs\") pod \"f1306003-e12b-4db1-beeb-cd461db0975e\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.586219 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-combined-ca-bundle\") pod \"f1306003-e12b-4db1-beeb-cd461db0975e\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.586285 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-httpd-config\") pod \"f1306003-e12b-4db1-beeb-cd461db0975e\" (UID: \"f1306003-e12b-4db1-beeb-cd461db0975e\") " Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.593578 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1306003-e12b-4db1-beeb-cd461db0975e-kube-api-access-zsqlz" (OuterVolumeSpecName: "kube-api-access-zsqlz") pod "f1306003-e12b-4db1-beeb-cd461db0975e" (UID: "f1306003-e12b-4db1-beeb-cd461db0975e"). InnerVolumeSpecName "kube-api-access-zsqlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.594076 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f1306003-e12b-4db1-beeb-cd461db0975e" (UID: "f1306003-e12b-4db1-beeb-cd461db0975e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.636709 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f1306003-e12b-4db1-beeb-cd461db0975e" (UID: "f1306003-e12b-4db1-beeb-cd461db0975e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.643387 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f1306003-e12b-4db1-beeb-cd461db0975e" (UID: "f1306003-e12b-4db1-beeb-cd461db0975e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.644223 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1306003-e12b-4db1-beeb-cd461db0975e" (UID: "f1306003-e12b-4db1-beeb-cd461db0975e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.652183 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-config" (OuterVolumeSpecName: "config") pod "f1306003-e12b-4db1-beeb-cd461db0975e" (UID: "f1306003-e12b-4db1-beeb-cd461db0975e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.669990 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f1306003-e12b-4db1-beeb-cd461db0975e" (UID: "f1306003-e12b-4db1-beeb-cd461db0975e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.688281 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsqlz\" (UniqueName: \"kubernetes.io/projected/f1306003-e12b-4db1-beeb-cd461db0975e-kube-api-access-zsqlz\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.688331 5047 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.688345 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.688357 5047 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:03 crc 
kubenswrapper[5047]: I0223 09:03:03.688368 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.688382 5047 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:03 crc kubenswrapper[5047]: I0223 09:03:03.688394 5047 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1306003-e12b-4db1-beeb-cd461db0975e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:04 crc kubenswrapper[5047]: I0223 09:03:04.027788 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755d9b4d6f-jbxjv" event={"ID":"f1306003-e12b-4db1-beeb-cd461db0975e","Type":"ContainerDied","Data":"e34a311f5c039beb78ef7cb8b18633ff064cb83ddb6ea59bdbeed93d52ec9700"} Feb 23 09:03:04 crc kubenswrapper[5047]: I0223 09:03:04.027850 5047 scope.go:117] "RemoveContainer" containerID="9ea42abec4f78326158bc5cb0cec2641d24efe082290fbd2778b5995dbeb07c2" Feb 23 09:03:04 crc kubenswrapper[5047]: I0223 09:03:04.028057 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-755d9b4d6f-jbxjv" Feb 23 09:03:04 crc kubenswrapper[5047]: I0223 09:03:04.081953 5047 scope.go:117] "RemoveContainer" containerID="90d8272e14124df09a1e4603e8bf04a61add7c1cf9078eea0c18f84d9f987fe3" Feb 23 09:03:04 crc kubenswrapper[5047]: I0223 09:03:04.085415 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-755d9b4d6f-jbxjv"] Feb 23 09:03:04 crc kubenswrapper[5047]: I0223 09:03:04.092318 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-755d9b4d6f-jbxjv"] Feb 23 09:03:04 crc kubenswrapper[5047]: I0223 09:03:04.360549 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" path="/var/lib/kubelet/pods/f1306003-e12b-4db1-beeb-cd461db0975e/volumes" Feb 23 09:03:07 crc kubenswrapper[5047]: I0223 09:03:07.364448 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:03:07 crc kubenswrapper[5047]: E0223 09:03:07.365337 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.147510 5047 generic.go:334] "Generic (PLEG): container finished" podID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerID="458b0e1642c869d058c9b43e3416f43f71ec7ed0417844e4fdd5515463a4d080" exitCode=137 Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.148207 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"b6c3821a-cbde-41d5-95fd-1e617b2d12bc","Type":"ContainerDied","Data":"458b0e1642c869d058c9b43e3416f43f71ec7ed0417844e4fdd5515463a4d080"} Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.151718 5047 generic.go:334] "Generic (PLEG): container finished" podID="adff4079-41bc-4cde-bc30-4a29f5302568" containerID="902673b5c0f9f5c778122d89416d764f9c657be00747f6d0aaf5e3bce9ebde6d" exitCode=137 Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.151841 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dd65c656-6n5cp" event={"ID":"adff4079-41bc-4cde-bc30-4a29f5302568","Type":"ContainerDied","Data":"902673b5c0f9f5c778122d89416d764f9c657be00747f6d0aaf5e3bce9ebde6d"} Feb 23 09:03:12 crc kubenswrapper[5047]: E0223 09:03:12.518140 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:12 crc kubenswrapper[5047]: E0223 09:03:12.519225 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:12 crc kubenswrapper[5047]: E0223 09:03:12.520477 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:12 crc kubenswrapper[5047]: E0223 09:03:12.520518 5047 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.589643 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.697745 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adff4079-41bc-4cde-bc30-4a29f5302568-logs\") pod \"adff4079-41bc-4cde-bc30-4a29f5302568\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.697871 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-tls-certs\") pod \"adff4079-41bc-4cde-bc30-4a29f5302568\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.697913 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-combined-ca-bundle\") pod \"adff4079-41bc-4cde-bc30-4a29f5302568\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.697933 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq7ln\" (UniqueName: \"kubernetes.io/projected/adff4079-41bc-4cde-bc30-4a29f5302568-kube-api-access-gq7ln\") pod \"adff4079-41bc-4cde-bc30-4a29f5302568\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.697953 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-config-data\") pod \"adff4079-41bc-4cde-bc30-4a29f5302568\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.697983 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-scripts\") pod \"adff4079-41bc-4cde-bc30-4a29f5302568\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.698022 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-secret-key\") pod \"adff4079-41bc-4cde-bc30-4a29f5302568\" (UID: \"adff4079-41bc-4cde-bc30-4a29f5302568\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.698897 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adff4079-41bc-4cde-bc30-4a29f5302568-logs" (OuterVolumeSpecName: "logs") pod "adff4079-41bc-4cde-bc30-4a29f5302568" (UID: "adff4079-41bc-4cde-bc30-4a29f5302568"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.734450 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-scripts" (OuterVolumeSpecName: "scripts") pod "adff4079-41bc-4cde-bc30-4a29f5302568" (UID: "adff4079-41bc-4cde-bc30-4a29f5302568"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.736654 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-config-data" (OuterVolumeSpecName: "config-data") pod "adff4079-41bc-4cde-bc30-4a29f5302568" (UID: "adff4079-41bc-4cde-bc30-4a29f5302568"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.737813 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adff4079-41bc-4cde-bc30-4a29f5302568-kube-api-access-gq7ln" (OuterVolumeSpecName: "kube-api-access-gq7ln") pod "adff4079-41bc-4cde-bc30-4a29f5302568" (UID: "adff4079-41bc-4cde-bc30-4a29f5302568"). InnerVolumeSpecName "kube-api-access-gq7ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.739240 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "adff4079-41bc-4cde-bc30-4a29f5302568" (UID: "adff4079-41bc-4cde-bc30-4a29f5302568"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.756248 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adff4079-41bc-4cde-bc30-4a29f5302568" (UID: "adff4079-41bc-4cde-bc30-4a29f5302568"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.773660 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "adff4079-41bc-4cde-bc30-4a29f5302568" (UID: "adff4079-41bc-4cde-bc30-4a29f5302568"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.800106 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.801993 5047 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.802071 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.802082 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq7ln\" (UniqueName: \"kubernetes.io/projected/adff4079-41bc-4cde-bc30-4a29f5302568-kube-api-access-gq7ln\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.802094 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.802104 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adff4079-41bc-4cde-bc30-4a29f5302568-scripts\") on node 
\"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.802113 5047 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adff4079-41bc-4cde-bc30-4a29f5302568-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.802120 5047 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adff4079-41bc-4cde-bc30-4a29f5302568-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.903808 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data\") pod \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.903877 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-etc-machine-id\") pod \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.904061 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j289z\" (UniqueName: \"kubernetes.io/projected/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-kube-api-access-j289z\") pod \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.904104 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data-custom\") pod \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " Feb 23 09:03:12 crc 
kubenswrapper[5047]: I0223 09:03:12.904136 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b6c3821a-cbde-41d5-95fd-1e617b2d12bc" (UID: "b6c3821a-cbde-41d5-95fd-1e617b2d12bc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.904222 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-scripts\") pod \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.904305 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-combined-ca-bundle\") pod \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\" (UID: \"b6c3821a-cbde-41d5-95fd-1e617b2d12bc\") " Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.904865 5047 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.908080 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-kube-api-access-j289z" (OuterVolumeSpecName: "kube-api-access-j289z") pod "b6c3821a-cbde-41d5-95fd-1e617b2d12bc" (UID: "b6c3821a-cbde-41d5-95fd-1e617b2d12bc"). InnerVolumeSpecName "kube-api-access-j289z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.908479 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6c3821a-cbde-41d5-95fd-1e617b2d12bc" (UID: "b6c3821a-cbde-41d5-95fd-1e617b2d12bc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.909240 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-scripts" (OuterVolumeSpecName: "scripts") pod "b6c3821a-cbde-41d5-95fd-1e617b2d12bc" (UID: "b6c3821a-cbde-41d5-95fd-1e617b2d12bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.949276 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6c3821a-cbde-41d5-95fd-1e617b2d12bc" (UID: "b6c3821a-cbde-41d5-95fd-1e617b2d12bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:12 crc kubenswrapper[5047]: I0223 09:03:12.999386 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data" (OuterVolumeSpecName: "config-data") pod "b6c3821a-cbde-41d5-95fd-1e617b2d12bc" (UID: "b6c3821a-cbde-41d5-95fd-1e617b2d12bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.006878 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.006958 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j289z\" (UniqueName: \"kubernetes.io/projected/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-kube-api-access-j289z\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.006984 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.007004 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.007047 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c3821a-cbde-41d5-95fd-1e617b2d12bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.167357 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86dd65c656-6n5cp" event={"ID":"adff4079-41bc-4cde-bc30-4a29f5302568","Type":"ContainerDied","Data":"3d866e70ebba36c008403cf1441ec6a5555de1d40558d732a15fa81a0bb7be51"} Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.167515 5047 scope.go:117] "RemoveContainer" containerID="47ff5d6db1f45daec44c8d4bbc86c28325f137beacfc6286b66cd85eba48a741" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.168957 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86dd65c656-6n5cp" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.173629 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c3821a-cbde-41d5-95fd-1e617b2d12bc","Type":"ContainerDied","Data":"2c83c28ee8b401e217876193e0b182868a9c236011490fd79285c3519c9dd77e"} Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.173723 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.283380 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.292742 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.325328 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86dd65c656-6n5cp"] Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.332301 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86dd65c656-6n5cp"] Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.477085 5047 scope.go:117] "RemoveContainer" containerID="902673b5c0f9f5c778122d89416d764f9c657be00747f6d0aaf5e3bce9ebde6d" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.575161 5047 scope.go:117] "RemoveContainer" containerID="0ff894197ec53c6fa19bcceb0a1998cc1302ed5266ed87a7fab99b18cfd8df6e" Feb 23 09:03:13 crc kubenswrapper[5047]: I0223 09:03:13.613862 5047 scope.go:117] "RemoveContainer" containerID="458b0e1642c869d058c9b43e3416f43f71ec7ed0417844e4fdd5515463a4d080" Feb 23 09:03:14 crc kubenswrapper[5047]: I0223 09:03:14.362984 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" path="/var/lib/kubelet/pods/adff4079-41bc-4cde-bc30-4a29f5302568/volumes" Feb 23 09:03:14 crc 
kubenswrapper[5047]: I0223 09:03:14.363585 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" path="/var/lib/kubelet/pods/b6c3821a-cbde-41d5-95fd-1e617b2d12bc/volumes" Feb 23 09:03:15 crc kubenswrapper[5047]: I0223 09:03:15.046150 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.115:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 09:03:15 crc kubenswrapper[5047]: I0223 09:03:15.046269 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.115:9292/healthcheck\": dial tcp 10.217.1.115:9292: i/o timeout" Feb 23 09:03:17 crc kubenswrapper[5047]: I0223 09:03:17.417227 5047 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86dd65c656-6n5cp" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.118:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8443: i/o timeout" Feb 23 09:03:21 crc kubenswrapper[5047]: I0223 09:03:21.341118 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:03:21 crc kubenswrapper[5047]: E0223 09:03:21.342367 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:03:22 crc kubenswrapper[5047]: E0223 09:03:22.520570 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:22 crc kubenswrapper[5047]: E0223 09:03:22.527067 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:22 crc kubenswrapper[5047]: E0223 09:03:22.530400 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:22 crc kubenswrapper[5047]: E0223 09:03:22.530493 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:03:26 crc kubenswrapper[5047]: I0223 09:03:26.876169 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-nh5jn" podUID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" Feb 23 09:03:32 crc kubenswrapper[5047]: I0223 09:03:32.057497 5047 generic.go:334] "Generic (PLEG): container finished" podID="33e156a6-7aa5-4769-8219-118aecb3b161" containerID="1c77c1f0f1e6b25a138a342acba27c33a4de799901c4d45d6ddcd5cd88c94c61" exitCode=-1 Feb 23 09:03:32 crc kubenswrapper[5047]: I0223 09:03:32.057606 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"33e156a6-7aa5-4769-8219-118aecb3b161","Type":"ContainerDied","Data":"1c77c1f0f1e6b25a138a342acba27c33a4de799901c4d45d6ddcd5cd88c94c61"} Feb 23 09:03:32 crc kubenswrapper[5047]: E0223 09:03:32.517570 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:32 crc kubenswrapper[5047]: E0223 09:03:32.519466 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:32 crc kubenswrapper[5047]: E0223 09:03:32.521282 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:32 crc kubenswrapper[5047]: E0223 09:03:32.521326 5047 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:03:35 crc kubenswrapper[5047]: I0223 09:03:35.341676 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:03:35 crc kubenswrapper[5047]: E0223 09:03:35.343112 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:03:35 crc kubenswrapper[5047]: I0223 09:03:35.798611 5047 generic.go:334] "Generic (PLEG): container finished" podID="f7982ba2-03dd-461c-bef5-d4223fd9ddd5" containerID="275245be4e1d753203f6c3c8cc1e21aa7ee24902f5fb7405f17a6544e6c06d89" exitCode=-1 Feb 23 09:03:35 crc kubenswrapper[5047]: I0223 09:03:35.798672 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f7982ba2-03dd-461c-bef5-d4223fd9ddd5","Type":"ContainerDied","Data":"275245be4e1d753203f6c3c8cc1e21aa7ee24902f5fb7405f17a6544e6c06d89"} Feb 23 09:03:36 crc kubenswrapper[5047]: I0223 09:03:36.877365 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-nh5jn" podUID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 09:03:42 crc kubenswrapper[5047]: E0223 09:03:42.517006 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is running failed: container process not found" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:42 crc kubenswrapper[5047]: E0223 09:03:42.518075 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is running failed: container process not found" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:42 crc kubenswrapper[5047]: E0223 09:03:42.518844 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is running failed: container process not found" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:42 crc kubenswrapper[5047]: E0223 09:03:42.518934 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:03:46 crc kubenswrapper[5047]: I0223 09:03:46.877146 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-nh5jn" podUID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 09:03:46 crc 
kubenswrapper[5047]: I0223 09:03:46.877533 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-nh5jn" Feb 23 09:03:46 crc kubenswrapper[5047]: I0223 09:03:46.878405 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"f0983d003fabd10061ec547251645b174c56ea5d575bcb957ce5c943deee01eb"} pod="metallb-system/frr-k8s-nh5jn" containerMessage="Container frr failed liveness probe, will be restarted" Feb 23 09:03:46 crc kubenswrapper[5047]: I0223 09:03:46.878522 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-nh5jn" podUID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerName="frr" containerID="cri-o://f0983d003fabd10061ec547251645b174c56ea5d575bcb957ce5c943deee01eb" gracePeriod=2 Feb 23 09:03:47 crc kubenswrapper[5047]: I0223 09:03:47.220150 5047 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 09:03:47 crc kubenswrapper[5047]: I0223 09:03:47.220268 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 09:03:47 crc kubenswrapper[5047]: I0223 09:03:47.341952 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:03:47 crc kubenswrapper[5047]: E0223 09:03:47.342596 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:03:49 crc kubenswrapper[5047]: I0223 09:03:49.628369 5047 generic.go:334] "Generic (PLEG): container finished" podID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" exitCode=-1 Feb 23 09:03:49 crc kubenswrapper[5047]: I0223 09:03:49.628593 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6db8c9f77f-vfbhs" event={"ID":"5adc996d-c7bb-49a2-bea0-909b84c93353","Type":"ContainerDied","Data":"c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746"} Feb 23 09:03:51 crc kubenswrapper[5047]: I0223 09:03:51.370362 5047 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 09:03:51 crc kubenswrapper[5047]: I0223 09:03:51.370432 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 09:03:52 crc kubenswrapper[5047]: E0223 09:03:52.517149 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is 
running failed: container process not found" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:52 crc kubenswrapper[5047]: E0223 09:03:52.518671 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is running failed: container process not found" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:52 crc kubenswrapper[5047]: E0223 09:03:52.519262 5047 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is running failed: container process not found" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 09:03:52 crc kubenswrapper[5047]: E0223 09:03:52.519361 5047 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-6db8c9f77f-vfbhs" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.120972 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.207686 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.276737 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgm8k\" (UniqueName: \"kubernetes.io/projected/5adc996d-c7bb-49a2-bea0-909b84c93353-kube-api-access-xgm8k\") pod \"5adc996d-c7bb-49a2-bea0-909b84c93353\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.276833 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-combined-ca-bundle\") pod \"5adc996d-c7bb-49a2-bea0-909b84c93353\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.276930 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data-custom\") pod \"5adc996d-c7bb-49a2-bea0-909b84c93353\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.276962 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data\") pod \"5adc996d-c7bb-49a2-bea0-909b84c93353\" (UID: \"5adc996d-c7bb-49a2-bea0-909b84c93353\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.282072 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5adc996d-c7bb-49a2-bea0-909b84c93353" (UID: "5adc996d-c7bb-49a2-bea0-909b84c93353"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.282244 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5adc996d-c7bb-49a2-bea0-909b84c93353-kube-api-access-xgm8k" (OuterVolumeSpecName: "kube-api-access-xgm8k") pod "5adc996d-c7bb-49a2-bea0-909b84c93353" (UID: "5adc996d-c7bb-49a2-bea0-909b84c93353"). InnerVolumeSpecName "kube-api-access-xgm8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.297776 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5adc996d-c7bb-49a2-bea0-909b84c93353" (UID: "5adc996d-c7bb-49a2-bea0-909b84c93353"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.319465 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data" (OuterVolumeSpecName: "config-data") pod "5adc996d-c7bb-49a2-bea0-909b84c93353" (UID: "5adc996d-c7bb-49a2-bea0-909b84c93353"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.378497 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701\") pod \"33e156a6-7aa5-4769-8219-118aecb3b161\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.378544 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/33e156a6-7aa5-4769-8219-118aecb3b161-ovn-data-cert\") pod \"33e156a6-7aa5-4769-8219-118aecb3b161\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.378572 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmdbw\" (UniqueName: \"kubernetes.io/projected/33e156a6-7aa5-4769-8219-118aecb3b161-kube-api-access-qmdbw\") pod \"33e156a6-7aa5-4769-8219-118aecb3b161\" (UID: \"33e156a6-7aa5-4769-8219-118aecb3b161\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.378936 5047 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.378954 5047 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.378964 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgm8k\" (UniqueName: \"kubernetes.io/projected/5adc996d-c7bb-49a2-bea0-909b84c93353-kube-api-access-xgm8k\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 
09:03:58.378974 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adc996d-c7bb-49a2-bea0-909b84c93353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.384149 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e156a6-7aa5-4769-8219-118aecb3b161-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "33e156a6-7aa5-4769-8219-118aecb3b161" (UID: "33e156a6-7aa5-4769-8219-118aecb3b161"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.384149 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e156a6-7aa5-4769-8219-118aecb3b161-kube-api-access-qmdbw" (OuterVolumeSpecName: "kube-api-access-qmdbw") pod "33e156a6-7aa5-4769-8219-118aecb3b161" (UID: "33e156a6-7aa5-4769-8219-118aecb3b161"). InnerVolumeSpecName "kube-api-access-qmdbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.390435 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701" (OuterVolumeSpecName: "ovn-data") pod "33e156a6-7aa5-4769-8219-118aecb3b161" (UID: "33e156a6-7aa5-4769-8219-118aecb3b161"). InnerVolumeSpecName "pvc-05b84435-efb2-4b49-8b88-57874eed5701". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.479828 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmdbw\" (UniqueName: \"kubernetes.io/projected/33e156a6-7aa5-4769-8219-118aecb3b161-kube-api-access-qmdbw\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.480158 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-05b84435-efb2-4b49-8b88-57874eed5701\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701\") on node \"crc\" " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.480176 5047 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/33e156a6-7aa5-4769-8219-118aecb3b161-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.514412 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.514594 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-05b84435-efb2-4b49-8b88-57874eed5701" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701") on node "crc" Feb 23 09:03:58 crc kubenswrapper[5047]: E0223 09:03:58.532151 5047 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5adc996d_c7bb_49a2_bea0_909b84c93353.slice\": RecentStats: unable to find data in memory cache]" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.581100 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-05b84435-efb2-4b49-8b88-57874eed5701\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-05b84435-efb2-4b49-8b88-57874eed5701\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.677685 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.682578 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c\") pod \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.682664 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlwzc\" (UniqueName: \"kubernetes.io/projected/f7982ba2-03dd-461c-bef5-d4223fd9ddd5-kube-api-access-mlwzc\") pod \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\" (UID: \"f7982ba2-03dd-461c-bef5-d4223fd9ddd5\") " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.686858 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7982ba2-03dd-461c-bef5-d4223fd9ddd5-kube-api-access-mlwzc" (OuterVolumeSpecName: "kube-api-access-mlwzc") pod "f7982ba2-03dd-461c-bef5-d4223fd9ddd5" (UID: "f7982ba2-03dd-461c-bef5-d4223fd9ddd5"). InnerVolumeSpecName "kube-api-access-mlwzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.696232 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c" (OuterVolumeSpecName: "mariadb-data") pod "f7982ba2-03dd-461c-bef5-d4223fd9ddd5" (UID: "f7982ba2-03dd-461c-bef5-d4223fd9ddd5"). InnerVolumeSpecName "pvc-e00019bb-2318-453f-86fb-844b28e7993c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.744351 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6db8c9f77f-vfbhs" event={"ID":"5adc996d-c7bb-49a2-bea0-909b84c93353","Type":"ContainerDied","Data":"c7807c5d9f393a58b4ab5708356fd88c32f39c95edb07851bda6944d7a823ac9"} Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.744388 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6db8c9f77f-vfbhs" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.744400 5047 scope.go:117] "RemoveContainer" containerID="c1e189e1b48f0343dd2bf505d57e29dca1d09bc94bbaadd88ba8b0b3a70e4746" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.746271 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.746263 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"33e156a6-7aa5-4769-8219-118aecb3b161","Type":"ContainerDied","Data":"3ed959145df59d193fbf97cb9f6f6526e3274b1e4aa71dc030b245e4512ecd17"} Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.749224 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.750635 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.761468 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 23 09:03:58 crc 
kubenswrapper[5047]: I0223 09:03:58.761564 5047 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d8fac09f4ceec44d8c4d007da5abfd18ee971eb0e28a823c9c81b05b8ecb1afc" exitCode=1 Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.761704 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d8fac09f4ceec44d8c4d007da5abfd18ee971eb0e28a823c9c81b05b8ecb1afc"} Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.762290 5047 scope.go:117] "RemoveContainer" containerID="d8fac09f4ceec44d8c4d007da5abfd18ee971eb0e28a823c9c81b05b8ecb1afc" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.765040 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6db8c9f77f-vfbhs"] Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.765732 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.765748 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f7982ba2-03dd-461c-bef5-d4223fd9ddd5","Type":"ContainerDied","Data":"bca99637c6f292db0aaad36e9beadb2bd35f59a9d94a6bd7f9450fcfc165a7b2"} Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.771587 5047 scope.go:117] "RemoveContainer" containerID="1c77c1f0f1e6b25a138a342acba27c33a4de799901c4d45d6ddcd5cd88c94c61" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.772258 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6db8c9f77f-vfbhs"] Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.773311 5047 generic.go:334] "Generic (PLEG): container finished" podID="e7785968-bae6-4ad4-bc4a-ccc4fac2cf41" containerID="f0983d003fabd10061ec547251645b174c56ea5d575bcb957ce5c943deee01eb" exitCode=143 Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.773353 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerDied","Data":"f0983d003fabd10061ec547251645b174c56ea5d575bcb957ce5c943deee01eb"} Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.773379 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nh5jn" event={"ID":"e7785968-bae6-4ad4-bc4a-ccc4fac2cf41","Type":"ContainerStarted","Data":"7b54df5c5f296235835235ad761108853fb47c1ade858a9be307df0fcf6820d6"} Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.784757 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlwzc\" (UniqueName: \"kubernetes.io/projected/f7982ba2-03dd-461c-bef5-d4223fd9ddd5-kube-api-access-mlwzc\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.784841 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"pvc-e00019bb-2318-453f-86fb-844b28e7993c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c\") on node \"crc\" " Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.810752 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.811130 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e00019bb-2318-453f-86fb-844b28e7993c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c") on node "crc" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.818506 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.823691 5047 scope.go:117] "RemoveContainer" containerID="db410610d66ac6995629552ac1e2a2d1f6f1ee0716cfced332200b1ec230d213" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.830944 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.872217 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.879322 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.882079 5047 scope.go:117] "RemoveContainer" containerID="275245be4e1d753203f6c3c8cc1e21aa7ee24902f5fb7405f17a6544e6c06d89" Feb 23 09:03:58 crc kubenswrapper[5047]: I0223 09:03:58.886047 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-e00019bb-2318-453f-86fb-844b28e7993c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00019bb-2318-453f-86fb-844b28e7993c\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:59 crc kubenswrapper[5047]: I0223 
09:03:59.228531 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 09:03:59 crc kubenswrapper[5047]: I0223 09:03:59.340941 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:03:59 crc kubenswrapper[5047]: E0223 09:03:59.341270 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:03:59 crc kubenswrapper[5047]: I0223 09:03:59.791433 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 23 09:03:59 crc kubenswrapper[5047]: I0223 09:03:59.793005 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 09:03:59 crc kubenswrapper[5047]: I0223 09:03:59.797014 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"90e137935e3da076fb6589af1832bc0329a98fc957f8de7ca0f20502c9fe03f6"} Feb 23 09:04:00 crc kubenswrapper[5047]: I0223 09:04:00.351170 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e156a6-7aa5-4769-8219-118aecb3b161" path="/var/lib/kubelet/pods/33e156a6-7aa5-4769-8219-118aecb3b161/volumes" Feb 23 09:04:00 crc kubenswrapper[5047]: I0223 09:04:00.352388 5047 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" path="/var/lib/kubelet/pods/5adc996d-c7bb-49a2-bea0-909b84c93353/volumes" Feb 23 09:04:00 crc kubenswrapper[5047]: I0223 09:04:00.353160 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7982ba2-03dd-461c-bef5-d4223fd9ddd5" path="/var/lib/kubelet/pods/f7982ba2-03dd-461c-bef5-d4223fd9ddd5/volumes" Feb 23 09:04:00 crc kubenswrapper[5047]: I0223 09:04:00.835951 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-nh5jn" Feb 23 09:04:00 crc kubenswrapper[5047]: I0223 09:04:00.875394 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-nh5jn" Feb 23 09:04:03 crc kubenswrapper[5047]: I0223 09:04:03.446258 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 09:04:03 crc kubenswrapper[5047]: I0223 09:04:03.450013 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 09:04:03 crc kubenswrapper[5047]: I0223 09:04:03.842326 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 09:04:09 crc kubenswrapper[5047]: I0223 09:04:09.236568 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 09:04:12 crc kubenswrapper[5047]: I0223 09:04:12.347292 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:04:12 crc kubenswrapper[5047]: E0223 09:04:12.349714 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:04:25 crc kubenswrapper[5047]: I0223 09:04:25.342106 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:04:25 crc kubenswrapper[5047]: E0223 09:04:25.344093 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:04:37 crc kubenswrapper[5047]: I0223 09:04:37.341521 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:04:37 crc kubenswrapper[5047]: E0223 09:04:37.342590 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:04:49 crc kubenswrapper[5047]: I0223 09:04:49.341025 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:04:49 crc kubenswrapper[5047]: E0223 09:04:49.341805 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:05:02 crc kubenswrapper[5047]: I0223 09:05:02.677653 5047 scope.go:117] "RemoveContainer" containerID="8110fa3c9068cd81f7b808eb4081d1c9b96f0d6ba507405957aa5e3d3e2001f4" Feb 23 09:05:02 crc kubenswrapper[5047]: I0223 09:05:02.698973 5047 scope.go:117] "RemoveContainer" containerID="89fe650989f06df1fa46950063352f4f07e0ac4cdfbad470ce07cca5b1f0509c" Feb 23 09:05:04 crc kubenswrapper[5047]: I0223 09:05:04.340832 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:05:04 crc kubenswrapper[5047]: E0223 09:05:04.341268 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:05:17 crc kubenswrapper[5047]: I0223 09:05:17.340681 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:05:17 crc kubenswrapper[5047]: E0223 09:05:17.341502 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 
09:05:29 crc kubenswrapper[5047]: I0223 09:05:29.341038 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:05:29 crc kubenswrapper[5047]: E0223 09:05:29.341733 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.865089 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7lqvl/must-gather-qksss"] Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866193 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerName="mysql-bootstrap" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866215 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerName="mysql-bootstrap" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866247 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="openstack-network-exporter" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866260 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="openstack-network-exporter" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866279 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866292 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866305 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cf7673-4d38-49e0-9b86-f80c3949fd06" containerName="memcached" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866319 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cf7673-4d38-49e0-9b86-f80c3949fd06" containerName="memcached" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866379 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerName="mariadb-account-create-update" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866392 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerName="mariadb-account-create-update" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866410 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d714b4f-02d1-433b-ba95-54c199957dce" containerName="kube-state-metrics" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866423 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d714b4f-02d1-433b-ba95-54c199957dce" containerName="kube-state-metrics" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866443 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-notifier" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866455 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-notifier" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866479 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866505 5047 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866525 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866537 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866553 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866565 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866581 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="cinder-scheduler" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866592 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="cinder-scheduler" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866607 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerName="rabbitmq" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866619 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerName="rabbitmq" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866639 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74700fa7-59df-4201-a7c4-de815b82208e" containerName="mysql-bootstrap" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866650 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74700fa7-59df-4201-a7c4-de815b82208e" containerName="mysql-bootstrap" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866659 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866666 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866675 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" containerName="nova-cell0-conductor-conductor" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866682 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" containerName="nova-cell0-conductor-conductor" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866688 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50aef737-c888-466c-92d0-9c683267d266" containerName="heat-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866695 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="50aef737-c888-466c-92d0-9c683267d266" containerName="heat-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866704 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-notification-agent" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866710 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-notification-agent" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866717 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866722 5047 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866733 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866741 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866752 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e156a6-7aa5-4769-8219-118aecb3b161" containerName="adoption" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866757 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e156a6-7aa5-4769-8219-118aecb3b161" containerName="adoption" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866765 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-metadata" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866771 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-metadata" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866781 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-listener" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866787 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-listener" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866799 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866804 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866810 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866815 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866824 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerName="barbican-keystone-listener-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866831 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerName="barbican-keystone-listener-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866841 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866847 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866858 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7982ba2-03dd-461c-bef5-d4223fd9ddd5" containerName="adoption" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866863 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7982ba2-03dd-461c-bef5-d4223fd9ddd5" containerName="adoption" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866871 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866877 5047 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866883 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866889 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866899 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerName="setup-container" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866922 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerName="setup-container" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866928 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="probe" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866934 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="probe" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866941 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866947 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866957 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55697218-de1b-424f-b5ff-2d0806e54a96" containerName="keystone-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866964 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="55697218-de1b-424f-b5ff-2d0806e54a96" 
containerName="keystone-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866972 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866978 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.866988 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="ovn-northd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.866994 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="ovn-northd" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867002 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="proxy-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867009 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="proxy-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867020 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867026 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867037 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerName="barbican-keystone-listener" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867043 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" 
containerName="barbican-keystone-listener" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867053 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3464c846-13b9-479e-b9af-3d571f03b284" containerName="nova-scheduler-scheduler" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867059 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="3464c846-13b9-479e-b9af-3d571f03b284" containerName="nova-scheduler-scheduler" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867070 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-central-agent" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867076 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-central-agent" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867085 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867091 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867100 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="sg-core" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867106 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="sg-core" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867114 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867121 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" 
containerName="neutron-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867130 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867136 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-log" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867147 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8596ec60-89ed-43d9-be63-b7130fd0f937" containerName="heat-cfnapi" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867153 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="8596ec60-89ed-43d9-be63-b7130fd0f937" containerName="heat-cfnapi" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867160 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867165 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-api" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867174 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867179 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867188 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="openstack-network-exporter" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867193 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" 
containerName="openstack-network-exporter" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867200 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01713a47-8bed-4038-8339-bdcd77e6e1db" containerName="nova-cell1-conductor-conductor" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867206 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="01713a47-8bed-4038-8339-bdcd77e6e1db" containerName="nova-cell1-conductor-conductor" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867215 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-evaluator" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867220 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-evaluator" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867227 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerName="setup-container" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867233 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerName="setup-container" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867244 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerName="galera" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867250 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerName="galera" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867257 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867262 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" 
containerName="horizon" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867271 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="ovsdbserver-nb" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867277 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="ovsdbserver-nb" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867287 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74700fa7-59df-4201-a7c4-de815b82208e" containerName="galera" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867292 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="74700fa7-59df-4201-a7c4-de815b82208e" containerName="galera" Feb 23 09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.867299 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerName="rabbitmq" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867304 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerName="rabbitmq" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867451 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867461 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867473 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867482 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="openstack-network-exporter" Feb 
23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867491 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867497 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867505 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="39cf7673-4d38-49e0-9b86-f80c3949fd06" containerName="memcached" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867514 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="probe" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867524 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="8596ec60-89ed-43d9-be63-b7130fd0f937" containerName="heat-cfnapi" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867533 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="74700fa7-59df-4201-a7c4-de815b82208e" containerName="galera" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867538 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867548 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867558 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-evaluator" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867565 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerName="barbican-keystone-listener" Feb 
23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867574 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="50aef737-c888-466c-92d0-9c683267d266" containerName="heat-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867584 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="570bec3d-603c-4f92-b183-c5abb7e799d8" containerName="nova-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867593 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="55697218-de1b-424f-b5ff-2d0806e54a96" containerName="keystone-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867602 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867612 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="01713a47-8bed-4038-8339-bdcd77e6e1db" containerName="nova-cell1-conductor-conductor" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867622 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="9205930b-2303-4b01-a3bf-cf4ef3ad0a49" containerName="ovsdbserver-nb" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867630 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="14988409-8f11-42ca-bc6d-a4ba3d3056a4" containerName="nova-cell0-conductor-conductor" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867636 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867643 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867652 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" 
containerName="sg-core" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867661 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f69678-8caa-45a4-8361-a0bf3ef10d19" containerName="barbican-keystone-listener-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867672 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d714b4f-02d1-433b-ba95-54c199957dce" containerName="kube-state-metrics" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867680 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="1058e169-d572-43c5-80b3-d5f2f2c78afb" containerName="barbican-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867686 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5adc996d-c7bb-49a2-bea0-909b84c93353" containerName="heat-engine" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867695 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7982ba2-03dd-461c-bef5-d4223fd9ddd5" containerName="adoption" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867702 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f82aad-7df9-4b14-a328-2cc708aeed84" containerName="rabbitmq" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867708 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="ovn-northd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867717 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="proxy-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867726 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e156a6-7aa5-4769-8219-118aecb3b161" containerName="adoption" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867736 5047 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-notification-agent" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867745 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4051daa4-7ea3-4ab2-ad1f-52353e0ad995" containerName="nova-metadata-metadata" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867755 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24cc8b9-bf0e-45d8-85a9-7c3937896968" containerName="ceilometer-central-agent" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867766 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-listener" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867772 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867779 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59427cb-019c-4f83-af18-75900909e70f" containerName="openstack-network-exporter" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867787 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3464c846-13b9-479e-b9af-3d571f03b284" containerName="nova-scheduler-scheduler" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867794 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="07094621-ca88-4942-a226-76667658a5bc" containerName="placement-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867802 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1306003-e12b-4db1-beeb-cd461db0975e" containerName="neutron-api" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867812 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="95116420-b62b-402c-bcbe-17026cba0354" containerName="glance-httpd" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 
09:05:39.867821 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="adff4079-41bc-4cde-bc30-4a29f5302568" containerName="horizon-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867828 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerName="mariadb-account-create-update" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867833 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="753e53f0-d44b-4af2-9aff-eabc1a46d537" containerName="galera" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867842 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="db558a41-6dbf-4b18-af50-6a5311530ef4" containerName="rabbitmq" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867848 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9c6500-238e-415a-9e31-e7bf9ccdd205" containerName="aodh-notifier" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867855 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce03d60-441e-4a79-acce-d54444aedfcb" containerName="glance-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867861 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c3821a-cbde-41d5-95fd-1e617b2d12bc" containerName="cinder-scheduler" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867869 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d570b43-d0ff-42d7-a305-9d7c2f9f9881" containerName="cinder-api-log" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867874 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerName="mariadb-account-create-update" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.867883 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e613490-e007-4f7e-9868-abf59633c7c2" containerName="barbican-worker-log" Feb 23 
09:05:39 crc kubenswrapper[5047]: E0223 09:05:39.868030 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerName="mariadb-account-create-update" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.868037 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4565dbe-dc04-452f-a79e-bc09cb299f29" containerName="mariadb-account-create-update" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.868705 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.870467 5047 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7lqvl"/"default-dockercfg-29l9n" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.870710 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7lqvl"/"kube-root-ca.crt" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.870834 5047 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7lqvl"/"openshift-service-ca.crt" Feb 23 09:05:39 crc kubenswrapper[5047]: I0223 09:05:39.873831 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7lqvl/must-gather-qksss"] Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.068780 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6f4j\" (UniqueName: \"kubernetes.io/projected/20142b23-ff61-429a-bcd2-459f698dad1d-kube-api-access-b6f4j\") pod \"must-gather-qksss\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") " pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.069110 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/20142b23-ff61-429a-bcd2-459f698dad1d-must-gather-output\") pod \"must-gather-qksss\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") " pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.170697 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6f4j\" (UniqueName: \"kubernetes.io/projected/20142b23-ff61-429a-bcd2-459f698dad1d-kube-api-access-b6f4j\") pod \"must-gather-qksss\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") " pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.170806 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20142b23-ff61-429a-bcd2-459f698dad1d-must-gather-output\") pod \"must-gather-qksss\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") " pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.171415 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20142b23-ff61-429a-bcd2-459f698dad1d-must-gather-output\") pod \"must-gather-qksss\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") " pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.199180 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6f4j\" (UniqueName: \"kubernetes.io/projected/20142b23-ff61-429a-bcd2-459f698dad1d-kube-api-access-b6f4j\") pod \"must-gather-qksss\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") " pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.488599 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/must-gather-qksss" Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.950538 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7lqvl/must-gather-qksss"] Feb 23 09:05:40 crc kubenswrapper[5047]: I0223 09:05:40.959274 5047 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:05:41 crc kubenswrapper[5047]: I0223 09:05:41.723758 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/must-gather-qksss" event={"ID":"20142b23-ff61-429a-bcd2-459f698dad1d","Type":"ContainerStarted","Data":"05b58af4cedb03ea17a7d6e83d1076792e62a256ff9b4b6069c6690c8835d3be"} Feb 23 09:05:44 crc kubenswrapper[5047]: I0223 09:05:44.345787 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:05:44 crc kubenswrapper[5047]: E0223 09:05:44.346628 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.643720 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-5nskx"] Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.645220 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.727860 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9sb8\" (UniqueName: \"kubernetes.io/projected/748f91ba-ab9f-4ef5-884d-271e13e2884e-kube-api-access-z9sb8\") pod \"crc-debug-5nskx\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.727967 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/748f91ba-ab9f-4ef5-884d-271e13e2884e-host\") pod \"crc-debug-5nskx\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.770799 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/must-gather-qksss" event={"ID":"20142b23-ff61-429a-bcd2-459f698dad1d","Type":"ContainerStarted","Data":"2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4"} Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.770845 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/must-gather-qksss" event={"ID":"20142b23-ff61-429a-bcd2-459f698dad1d","Type":"ContainerStarted","Data":"b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452"} Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.791250 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7lqvl/must-gather-qksss" podStartSLOduration=2.987079175 podStartE2EDuration="8.791220296s" podCreationTimestamp="2026-02-23 09:05:39 +0000 UTC" firstStartedPulling="2026-02-23 09:05:40.959018953 +0000 UTC m=+8463.210346087" lastFinishedPulling="2026-02-23 09:05:46.763160074 +0000 UTC m=+8469.014487208" 
observedRunningTime="2026-02-23 09:05:47.791165945 +0000 UTC m=+8470.042493089" watchObservedRunningTime="2026-02-23 09:05:47.791220296 +0000 UTC m=+8470.042547450" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.829201 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9sb8\" (UniqueName: \"kubernetes.io/projected/748f91ba-ab9f-4ef5-884d-271e13e2884e-kube-api-access-z9sb8\") pod \"crc-debug-5nskx\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.829246 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/748f91ba-ab9f-4ef5-884d-271e13e2884e-host\") pod \"crc-debug-5nskx\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.829581 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/748f91ba-ab9f-4ef5-884d-271e13e2884e-host\") pod \"crc-debug-5nskx\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.860681 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9sb8\" (UniqueName: \"kubernetes.io/projected/748f91ba-ab9f-4ef5-884d-271e13e2884e-kube-api-access-z9sb8\") pod \"crc-debug-5nskx\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: I0223 09:05:47.966729 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:05:47 crc kubenswrapper[5047]: W0223 09:05:47.992759 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod748f91ba_ab9f_4ef5_884d_271e13e2884e.slice/crio-f3f9d2e2fd4a9b6bed84bf49e6a95de48bd13622ee5a47e14a354d3c86e8cffc WatchSource:0}: Error finding container f3f9d2e2fd4a9b6bed84bf49e6a95de48bd13622ee5a47e14a354d3c86e8cffc: Status 404 returned error can't find the container with id f3f9d2e2fd4a9b6bed84bf49e6a95de48bd13622ee5a47e14a354d3c86e8cffc Feb 23 09:05:48 crc kubenswrapper[5047]: I0223 09:05:48.781257 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" event={"ID":"748f91ba-ab9f-4ef5-884d-271e13e2884e","Type":"ContainerStarted","Data":"f3f9d2e2fd4a9b6bed84bf49e6a95de48bd13622ee5a47e14a354d3c86e8cffc"} Feb 23 09:05:59 crc kubenswrapper[5047]: I0223 09:05:59.340985 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:05:59 crc kubenswrapper[5047]: E0223 09:05:59.341762 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:06:00 crc kubenswrapper[5047]: I0223 09:06:00.870669 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" event={"ID":"748f91ba-ab9f-4ef5-884d-271e13e2884e","Type":"ContainerStarted","Data":"f72028008a69e0f436ccb7841158dc9d00c14ad478507267b54ba9a2f7abd2fd"} Feb 23 09:06:00 crc kubenswrapper[5047]: I0223 09:06:00.896118 5047 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" podStartSLOduration=1.8694033079999999 podStartE2EDuration="13.89608792s" podCreationTimestamp="2026-02-23 09:05:47 +0000 UTC" firstStartedPulling="2026-02-23 09:05:47.995608051 +0000 UTC m=+8470.246935185" lastFinishedPulling="2026-02-23 09:06:00.022292663 +0000 UTC m=+8482.273619797" observedRunningTime="2026-02-23 09:06:00.886857991 +0000 UTC m=+8483.138185125" watchObservedRunningTime="2026-02-23 09:06:00.89608792 +0000 UTC m=+8483.147415094" Feb 23 09:06:02 crc kubenswrapper[5047]: I0223 09:06:02.765248 5047 scope.go:117] "RemoveContainer" containerID="5ee8f747b109ed16b9e7a124807d89b72589cbc2393d42b99e3fd63fd22b516e" Feb 23 09:06:11 crc kubenswrapper[5047]: I0223 09:06:11.340840 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:06:11 crc kubenswrapper[5047]: E0223 09:06:11.341840 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.176978 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqxms"] Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.178712 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.192081 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqxms"] Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.355623 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-catalog-content\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.356197 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-utilities\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.356336 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhg77\" (UniqueName: \"kubernetes.io/projected/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-kube-api-access-zhg77\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.458227 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg77\" (UniqueName: \"kubernetes.io/projected/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-kube-api-access-zhg77\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.458628 5047 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-catalog-content\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.458688 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-utilities\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.459384 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-catalog-content\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.460131 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-utilities\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.479825 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhg77\" (UniqueName: \"kubernetes.io/projected/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-kube-api-access-zhg77\") pod \"certified-operators-vqxms\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:12 crc kubenswrapper[5047]: I0223 09:06:12.508031 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:13 crc kubenswrapper[5047]: I0223 09:06:13.059662 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqxms"] Feb 23 09:06:13 crc kubenswrapper[5047]: I0223 09:06:13.962360 5047 generic.go:334] "Generic (PLEG): container finished" podID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerID="f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd" exitCode=0 Feb 23 09:06:13 crc kubenswrapper[5047]: I0223 09:06:13.962474 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqxms" event={"ID":"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa","Type":"ContainerDied","Data":"f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd"} Feb 23 09:06:13 crc kubenswrapper[5047]: I0223 09:06:13.962934 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqxms" event={"ID":"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa","Type":"ContainerStarted","Data":"e5a2567765f2b2c6b52d6538804b9937d6b4627da20003c96caacb4feb73e11b"} Feb 23 09:06:16 crc kubenswrapper[5047]: I0223 09:06:16.995106 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqxms" event={"ID":"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa","Type":"ContainerStarted","Data":"81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d"} Feb 23 09:06:18 crc kubenswrapper[5047]: I0223 09:06:18.006527 5047 generic.go:334] "Generic (PLEG): container finished" podID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerID="81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d" exitCode=0 Feb 23 09:06:18 crc kubenswrapper[5047]: I0223 09:06:18.006793 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqxms" 
event={"ID":"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa","Type":"ContainerDied","Data":"81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d"} Feb 23 09:06:20 crc kubenswrapper[5047]: I0223 09:06:20.026158 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqxms" event={"ID":"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa","Type":"ContainerStarted","Data":"6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9"} Feb 23 09:06:20 crc kubenswrapper[5047]: I0223 09:06:20.048491 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqxms" podStartSLOduration=2.836493929 podStartE2EDuration="8.048469779s" podCreationTimestamp="2026-02-23 09:06:12 +0000 UTC" firstStartedPulling="2026-02-23 09:06:13.968031527 +0000 UTC m=+8496.219358661" lastFinishedPulling="2026-02-23 09:06:19.180007377 +0000 UTC m=+8501.431334511" observedRunningTime="2026-02-23 09:06:20.046085146 +0000 UTC m=+8502.297412280" watchObservedRunningTime="2026-02-23 09:06:20.048469779 +0000 UTC m=+8502.299796923" Feb 23 09:06:22 crc kubenswrapper[5047]: I0223 09:06:22.508823 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:22 crc kubenswrapper[5047]: I0223 09:06:22.509305 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:22 crc kubenswrapper[5047]: I0223 09:06:22.573736 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:25 crc kubenswrapper[5047]: I0223 09:06:25.071203 5047 generic.go:334] "Generic (PLEG): container finished" podID="748f91ba-ab9f-4ef5-884d-271e13e2884e" containerID="f72028008a69e0f436ccb7841158dc9d00c14ad478507267b54ba9a2f7abd2fd" exitCode=0 Feb 23 09:06:25 crc kubenswrapper[5047]: I0223 09:06:25.071339 
5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" event={"ID":"748f91ba-ab9f-4ef5-884d-271e13e2884e","Type":"ContainerDied","Data":"f72028008a69e0f436ccb7841158dc9d00c14ad478507267b54ba9a2f7abd2fd"} Feb 23 09:06:25 crc kubenswrapper[5047]: I0223 09:06:25.340802 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:06:25 crc kubenswrapper[5047]: E0223 09:06:25.341060 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.164213 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.198397 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-5nskx"] Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.203009 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-5nskx"] Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.276043 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/748f91ba-ab9f-4ef5-884d-271e13e2884e-host\") pod \"748f91ba-ab9f-4ef5-884d-271e13e2884e\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.276118 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9sb8\" (UniqueName: \"kubernetes.io/projected/748f91ba-ab9f-4ef5-884d-271e13e2884e-kube-api-access-z9sb8\") pod \"748f91ba-ab9f-4ef5-884d-271e13e2884e\" (UID: \"748f91ba-ab9f-4ef5-884d-271e13e2884e\") " Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.276171 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/748f91ba-ab9f-4ef5-884d-271e13e2884e-host" (OuterVolumeSpecName: "host") pod "748f91ba-ab9f-4ef5-884d-271e13e2884e" (UID: "748f91ba-ab9f-4ef5-884d-271e13e2884e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.276472 5047 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/748f91ba-ab9f-4ef5-884d-271e13e2884e-host\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.281330 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/748f91ba-ab9f-4ef5-884d-271e13e2884e-kube-api-access-z9sb8" (OuterVolumeSpecName: "kube-api-access-z9sb8") pod "748f91ba-ab9f-4ef5-884d-271e13e2884e" (UID: "748f91ba-ab9f-4ef5-884d-271e13e2884e"). InnerVolumeSpecName "kube-api-access-z9sb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.352505 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="748f91ba-ab9f-4ef5-884d-271e13e2884e" path="/var/lib/kubelet/pods/748f91ba-ab9f-4ef5-884d-271e13e2884e/volumes" Feb 23 09:06:26 crc kubenswrapper[5047]: I0223 09:06:26.378436 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9sb8\" (UniqueName: \"kubernetes.io/projected/748f91ba-ab9f-4ef5-884d-271e13e2884e-kube-api-access-z9sb8\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.089069 5047 scope.go:117] "RemoveContainer" containerID="f72028008a69e0f436ccb7841158dc9d00c14ad478507267b54ba9a2f7abd2fd" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.089103 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-5nskx" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.423431 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-xz6hp"] Feb 23 09:06:27 crc kubenswrapper[5047]: E0223 09:06:27.423748 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="748f91ba-ab9f-4ef5-884d-271e13e2884e" containerName="container-00" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.423760 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="748f91ba-ab9f-4ef5-884d-271e13e2884e" containerName="container-00" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.423926 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="748f91ba-ab9f-4ef5-884d-271e13e2884e" containerName="container-00" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.424405 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.496133 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd6nq\" (UniqueName: \"kubernetes.io/projected/307e71d5-c268-4031-ae25-433b273287e5-kube-api-access-rd6nq\") pod \"crc-debug-xz6hp\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.496194 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/307e71d5-c268-4031-ae25-433b273287e5-host\") pod \"crc-debug-xz6hp\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.597757 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/307e71d5-c268-4031-ae25-433b273287e5-host\") pod \"crc-debug-xz6hp\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.597900 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd6nq\" (UniqueName: \"kubernetes.io/projected/307e71d5-c268-4031-ae25-433b273287e5-kube-api-access-rd6nq\") pod \"crc-debug-xz6hp\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.598365 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/307e71d5-c268-4031-ae25-433b273287e5-host\") pod \"crc-debug-xz6hp\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.622559 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd6nq\" (UniqueName: \"kubernetes.io/projected/307e71d5-c268-4031-ae25-433b273287e5-kube-api-access-rd6nq\") pod \"crc-debug-xz6hp\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:27 crc kubenswrapper[5047]: I0223 09:06:27.740983 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:28 crc kubenswrapper[5047]: I0223 09:06:28.100872 5047 generic.go:334] "Generic (PLEG): container finished" podID="307e71d5-c268-4031-ae25-433b273287e5" containerID="0bd138744bcd6bb64192fb80ec1629e63795c6b191920e747533027003f9c971" exitCode=0 Feb 23 09:06:28 crc kubenswrapper[5047]: I0223 09:06:28.100955 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" event={"ID":"307e71d5-c268-4031-ae25-433b273287e5","Type":"ContainerDied","Data":"0bd138744bcd6bb64192fb80ec1629e63795c6b191920e747533027003f9c971"} Feb 23 09:06:28 crc kubenswrapper[5047]: I0223 09:06:28.101250 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" event={"ID":"307e71d5-c268-4031-ae25-433b273287e5","Type":"ContainerStarted","Data":"7c3fd405355d328f52840578e2a65525423403b351a6780267231cfe498132c7"} Feb 23 09:06:28 crc kubenswrapper[5047]: I0223 09:06:28.262063 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-xz6hp"] Feb 23 09:06:28 crc kubenswrapper[5047]: I0223 09:06:28.269501 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-xz6hp"] Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.178517 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.326292 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/307e71d5-c268-4031-ae25-433b273287e5-host\") pod \"307e71d5-c268-4031-ae25-433b273287e5\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.326375 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd6nq\" (UniqueName: \"kubernetes.io/projected/307e71d5-c268-4031-ae25-433b273287e5-kube-api-access-rd6nq\") pod \"307e71d5-c268-4031-ae25-433b273287e5\" (UID: \"307e71d5-c268-4031-ae25-433b273287e5\") " Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.326561 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/307e71d5-c268-4031-ae25-433b273287e5-host" (OuterVolumeSpecName: "host") pod "307e71d5-c268-4031-ae25-433b273287e5" (UID: "307e71d5-c268-4031-ae25-433b273287e5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.327681 5047 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/307e71d5-c268-4031-ae25-433b273287e5-host\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.339070 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307e71d5-c268-4031-ae25-433b273287e5-kube-api-access-rd6nq" (OuterVolumeSpecName: "kube-api-access-rd6nq") pod "307e71d5-c268-4031-ae25-433b273287e5" (UID: "307e71d5-c268-4031-ae25-433b273287e5"). InnerVolumeSpecName "kube-api-access-rd6nq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.429676 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd6nq\" (UniqueName: \"kubernetes.io/projected/307e71d5-c268-4031-ae25-433b273287e5-kube-api-access-rd6nq\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.438077 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-pxfbc"] Feb 23 09:06:29 crc kubenswrapper[5047]: E0223 09:06:29.438538 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307e71d5-c268-4031-ae25-433b273287e5" containerName="container-00" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.438560 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="307e71d5-c268-4031-ae25-433b273287e5" containerName="container-00" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.438842 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="307e71d5-c268-4031-ae25-433b273287e5" containerName="container-00" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.441356 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.530927 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15432459-a935-4c31-897a-f62bb64b53fe-host\") pod \"crc-debug-pxfbc\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.531003 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr4ht\" (UniqueName: \"kubernetes.io/projected/15432459-a935-4c31-897a-f62bb64b53fe-kube-api-access-sr4ht\") pod \"crc-debug-pxfbc\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.632374 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15432459-a935-4c31-897a-f62bb64b53fe-host\") pod \"crc-debug-pxfbc\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.632458 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr4ht\" (UniqueName: \"kubernetes.io/projected/15432459-a935-4c31-897a-f62bb64b53fe-kube-api-access-sr4ht\") pod \"crc-debug-pxfbc\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.632554 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15432459-a935-4c31-897a-f62bb64b53fe-host\") pod \"crc-debug-pxfbc\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc 
kubenswrapper[5047]: I0223 09:06:29.654334 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr4ht\" (UniqueName: \"kubernetes.io/projected/15432459-a935-4c31-897a-f62bb64b53fe-kube-api-access-sr4ht\") pod \"crc-debug-pxfbc\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc kubenswrapper[5047]: I0223 09:06:29.786971 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:29 crc kubenswrapper[5047]: W0223 09:06:29.812897 5047 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15432459_a935_4c31_897a_f62bb64b53fe.slice/crio-3bca8a7f5b5fe809cd0cd624099caae2a5ec665c89a8e56dab91871ad6cb946a WatchSource:0}: Error finding container 3bca8a7f5b5fe809cd0cd624099caae2a5ec665c89a8e56dab91871ad6cb946a: Status 404 returned error can't find the container with id 3bca8a7f5b5fe809cd0cd624099caae2a5ec665c89a8e56dab91871ad6cb946a Feb 23 09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.118047 5047 generic.go:334] "Generic (PLEG): container finished" podID="15432459-a935-4c31-897a-f62bb64b53fe" containerID="14c0baaebe68b98e6dddb238b6a7c9f8c0989ed61403060afdcf5cd2c4c72ff0" exitCode=0 Feb 23 09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.118384 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" event={"ID":"15432459-a935-4c31-897a-f62bb64b53fe","Type":"ContainerDied","Data":"14c0baaebe68b98e6dddb238b6a7c9f8c0989ed61403060afdcf5cd2c4c72ff0"} Feb 23 09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.118419 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" event={"ID":"15432459-a935-4c31-897a-f62bb64b53fe","Type":"ContainerStarted","Data":"3bca8a7f5b5fe809cd0cd624099caae2a5ec665c89a8e56dab91871ad6cb946a"} Feb 23 
09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.120691 5047 scope.go:117] "RemoveContainer" containerID="0bd138744bcd6bb64192fb80ec1629e63795c6b191920e747533027003f9c971" Feb 23 09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.120815 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-xz6hp" Feb 23 09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.160381 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-pxfbc"] Feb 23 09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.169988 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7lqvl/crc-debug-pxfbc"] Feb 23 09:06:30 crc kubenswrapper[5047]: I0223 09:06:30.381958 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307e71d5-c268-4031-ae25-433b273287e5" path="/var/lib/kubelet/pods/307e71d5-c268-4031-ae25-433b273287e5/volumes" Feb 23 09:06:31 crc kubenswrapper[5047]: I0223 09:06:31.218402 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:31 crc kubenswrapper[5047]: I0223 09:06:31.288128 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr4ht\" (UniqueName: \"kubernetes.io/projected/15432459-a935-4c31-897a-f62bb64b53fe-kube-api-access-sr4ht\") pod \"15432459-a935-4c31-897a-f62bb64b53fe\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " Feb 23 09:06:31 crc kubenswrapper[5047]: I0223 09:06:31.288316 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15432459-a935-4c31-897a-f62bb64b53fe-host\") pod \"15432459-a935-4c31-897a-f62bb64b53fe\" (UID: \"15432459-a935-4c31-897a-f62bb64b53fe\") " Feb 23 09:06:31 crc kubenswrapper[5047]: I0223 09:06:31.288431 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15432459-a935-4c31-897a-f62bb64b53fe-host" (OuterVolumeSpecName: "host") pod "15432459-a935-4c31-897a-f62bb64b53fe" (UID: "15432459-a935-4c31-897a-f62bb64b53fe"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 09:06:31 crc kubenswrapper[5047]: I0223 09:06:31.288850 5047 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15432459-a935-4c31-897a-f62bb64b53fe-host\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:31 crc kubenswrapper[5047]: I0223 09:06:31.295635 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15432459-a935-4c31-897a-f62bb64b53fe-kube-api-access-sr4ht" (OuterVolumeSpecName: "kube-api-access-sr4ht") pod "15432459-a935-4c31-897a-f62bb64b53fe" (UID: "15432459-a935-4c31-897a-f62bb64b53fe"). InnerVolumeSpecName "kube-api-access-sr4ht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:06:31 crc kubenswrapper[5047]: I0223 09:06:31.390532 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr4ht\" (UniqueName: \"kubernetes.io/projected/15432459-a935-4c31-897a-f62bb64b53fe-kube-api-access-sr4ht\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:32 crc kubenswrapper[5047]: I0223 09:06:32.152337 5047 scope.go:117] "RemoveContainer" containerID="14c0baaebe68b98e6dddb238b6a7c9f8c0989ed61403060afdcf5cd2c4c72ff0" Feb 23 09:06:32 crc kubenswrapper[5047]: I0223 09:06:32.152414 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lqvl/crc-debug-pxfbc" Feb 23 09:06:32 crc kubenswrapper[5047]: I0223 09:06:32.353608 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15432459-a935-4c31-897a-f62bb64b53fe" path="/var/lib/kubelet/pods/15432459-a935-4c31-897a-f62bb64b53fe/volumes" Feb 23 09:06:32 crc kubenswrapper[5047]: I0223 09:06:32.583799 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:32 crc kubenswrapper[5047]: I0223 09:06:32.639789 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqxms"] Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.160684 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqxms" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="registry-server" containerID="cri-o://6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9" gracePeriod=2 Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.565518 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.623694 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-utilities\") pod \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.623804 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-catalog-content\") pod \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.623850 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhg77\" (UniqueName: \"kubernetes.io/projected/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-kube-api-access-zhg77\") pod \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\" (UID: \"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa\") " Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.624777 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-utilities" (OuterVolumeSpecName: "utilities") pod "a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" (UID: "a6e4ad09-4c59-4ed8-b826-d64856a4c7aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.629120 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-kube-api-access-zhg77" (OuterVolumeSpecName: "kube-api-access-zhg77") pod "a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" (UID: "a6e4ad09-4c59-4ed8-b826-d64856a4c7aa"). InnerVolumeSpecName "kube-api-access-zhg77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.683622 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" (UID: "a6e4ad09-4c59-4ed8-b826-d64856a4c7aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.726450 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.726502 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:33 crc kubenswrapper[5047]: I0223 09:06:33.726519 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhg77\" (UniqueName: \"kubernetes.io/projected/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa-kube-api-access-zhg77\") on node \"crc\" DevicePath \"\"" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.174716 5047 generic.go:334] "Generic (PLEG): container finished" podID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerID="6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9" exitCode=0 Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.174964 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqxms" event={"ID":"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa","Type":"ContainerDied","Data":"6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9"} Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.175366 5047 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-vqxms" event={"ID":"a6e4ad09-4c59-4ed8-b826-d64856a4c7aa","Type":"ContainerDied","Data":"e5a2567765f2b2c6b52d6538804b9937d6b4627da20003c96caacb4feb73e11b"} Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.175416 5047 scope.go:117] "RemoveContainer" containerID="6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.175062 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqxms" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.212459 5047 scope.go:117] "RemoveContainer" containerID="81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.218837 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqxms"] Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.228109 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqxms"] Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.239272 5047 scope.go:117] "RemoveContainer" containerID="f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.256969 5047 scope.go:117] "RemoveContainer" containerID="6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9" Feb 23 09:06:34 crc kubenswrapper[5047]: E0223 09:06:34.257449 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9\": container with ID starting with 6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9 not found: ID does not exist" containerID="6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 
09:06:34.257488 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9"} err="failed to get container status \"6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9\": rpc error: code = NotFound desc = could not find container \"6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9\": container with ID starting with 6148afd48891e639f61d316584453667b59ba261804ba5b40ba23a2870ce3dc9 not found: ID does not exist" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.257514 5047 scope.go:117] "RemoveContainer" containerID="81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d" Feb 23 09:06:34 crc kubenswrapper[5047]: E0223 09:06:34.258002 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d\": container with ID starting with 81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d not found: ID does not exist" containerID="81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.258045 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d"} err="failed to get container status \"81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d\": rpc error: code = NotFound desc = could not find container \"81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d\": container with ID starting with 81d046284264a797c84be58a76383c672843d09e3422af11252048813636971d not found: ID does not exist" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.258095 5047 scope.go:117] "RemoveContainer" containerID="f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd" Feb 23 09:06:34 crc 
kubenswrapper[5047]: E0223 09:06:34.258525 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd\": container with ID starting with f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd not found: ID does not exist" containerID="f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.258551 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd"} err="failed to get container status \"f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd\": rpc error: code = NotFound desc = could not find container \"f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd\": container with ID starting with f9e7b2c95d8d316a3f8c2fb721800cf74cd68ea078c437201cbe21e2687e27fd not found: ID does not exist" Feb 23 09:06:34 crc kubenswrapper[5047]: I0223 09:06:34.352709 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" path="/var/lib/kubelet/pods/a6e4ad09-4c59-4ed8-b826-d64856a4c7aa/volumes" Feb 23 09:06:36 crc kubenswrapper[5047]: I0223 09:06:36.341479 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:06:36 crc kubenswrapper[5047]: E0223 09:06:36.342279 5047 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wh6hv_openshift-machine-config-operator(ca275411-978b-439b-ab4b-f98a7ac42f8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" Feb 23 09:06:47 crc 
kubenswrapper[5047]: I0223 09:06:47.008195 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_146c85be-9d67-4281-873e-b27f5e90d957/openstack-network-exporter/0.log" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.200185 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_146c85be-9d67-4281-873e-b27f5e90d957/ovsdbserver-nb/0.log" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.291707 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0c6cd063-9d8c-4689-8e90-346ae2ec1ea8/openstack-network-exporter/0.log" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.340604 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.374287 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0c6cd063-9d8c-4689-8e90-346ae2ec1ea8/ovsdbserver-nb/0.log" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.478780 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_74acc7db-4095-419a-9b09-afa04283a69f/openstack-network-exporter/0.log" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.616867 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_74acc7db-4095-419a-9b09-afa04283a69f/ovsdbserver-sb/0.log" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.698844 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_385bab5b-7cad-4274-b672-a0614be3f41e/openstack-network-exporter/0.log" Feb 23 09:06:47 crc kubenswrapper[5047]: I0223 09:06:47.835305 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_385bab5b-7cad-4274-b672-a0614be3f41e/ovsdbserver-sb/0.log" Feb 23 09:06:48 crc kubenswrapper[5047]: I0223 09:06:48.289051 5047 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"1c1d3f51ba6a1b765001fc2b3608a0b8587309852ab518a873052699c9dfa680"} Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.191457 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7_c67ccc6d-2058-4aa1-b91f-db542ab3ff96/util/0.log" Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.389068 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7_c67ccc6d-2058-4aa1-b91f-db542ab3ff96/util/0.log" Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.413115 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7_c67ccc6d-2058-4aa1-b91f-db542ab3ff96/pull/0.log" Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.477431 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7_c67ccc6d-2058-4aa1-b91f-db542ab3ff96/pull/0.log" Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.608452 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7_c67ccc6d-2058-4aa1-b91f-db542ab3ff96/extract/0.log" Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.623098 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7_c67ccc6d-2058-4aa1-b91f-db542ab3ff96/util/0.log" Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.679937 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967w7vk7_c67ccc6d-2058-4aa1-b91f-db542ab3ff96/pull/0.log" Feb 23 09:07:02 crc kubenswrapper[5047]: I0223 09:07:02.821707 5047 scope.go:117] "RemoveContainer" containerID="b8b6a129529add188d1a88c8e83dc2d8bb33b5794c67f05f63bbf20295fd9dd5" Feb 23 09:07:03 crc kubenswrapper[5047]: I0223 09:07:03.074947 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-6sl7g_fd87e444-98c8-4de9-9a11-ec9678daaeaa/manager/0.log" Feb 23 09:07:03 crc kubenswrapper[5047]: I0223 09:07:03.447097 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-7gz7l_8dcb699b-69b9-4a9c-86ff-d04bf088e297/manager/0.log" Feb 23 09:07:03 crc kubenswrapper[5047]: I0223 09:07:03.583260 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-pbjl2_b047da04-c7bc-47c3-aae0-71b9d98a650e/manager/0.log" Feb 23 09:07:03 crc kubenswrapper[5047]: I0223 09:07:03.824229 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-cqtpj_9334599a-1686-4e49-b6e2-799c2038e0df/manager/0.log" Feb 23 09:07:04 crc kubenswrapper[5047]: I0223 09:07:04.257839 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-qr226_a263ae62-717d-4032-b1e2-042f1b3e936e/manager/0.log" Feb 23 09:07:04 crc kubenswrapper[5047]: I0223 09:07:04.838132 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-wdx4l_c67d85ac-ca9b-4b6f-826d-c04dd6b6850b/manager/0.log" Feb 23 09:07:04 crc kubenswrapper[5047]: I0223 09:07:04.960918 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-hfcm8_4f170913-8ccb-42e2-9113-23fb049373c9/manager/0.log" Feb 23 09:07:05 crc kubenswrapper[5047]: I0223 09:07:05.186494 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-bg6jh_86dd701f-396e-4fc1-8d42-29befb968db9/manager/0.log" Feb 23 09:07:05 crc kubenswrapper[5047]: I0223 09:07:05.486821 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-lmr7j_d331d283-a586-4c54-9732-151a8050ed40/manager/0.log" Feb 23 09:07:05 crc kubenswrapper[5047]: I0223 09:07:05.752260 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-bs8mv_af0d0299-b4a6-47bd-99a1-567fd5126f5c/manager/0.log" Feb 23 09:07:06 crc kubenswrapper[5047]: I0223 09:07:06.222894 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-h6tck_32942f2f-2abb-4f90-8444-ed9e77aeef57/manager/0.log" Feb 23 09:07:06 crc kubenswrapper[5047]: I0223 09:07:06.414827 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-bbxfl_f1b21582-c058-4cbc-bc6b-95d77c4a526c/manager/0.log" Feb 23 09:07:06 crc kubenswrapper[5047]: I0223 09:07:06.606857 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-7642d_d5a20255-deb8-4cda-a48c-2735b1e66247/manager/0.log" Feb 23 09:07:06 crc kubenswrapper[5047]: I0223 09:07:06.752842 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-68nbx_f5b3049a-c454-4b3a-bc73-7529f164bcf1/operator/0.log" Feb 23 09:07:07 crc kubenswrapper[5047]: I0223 09:07:07.160850 5047 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cxhzm_59e0a106-e284-4715-b287-b77124d0fc64/registry-server/0.log" Feb 23 09:07:07 crc kubenswrapper[5047]: I0223 09:07:07.415393 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-w297v_9e1bca77-f8b9-41d3-8547-2d7e9bf8ac71/manager/0.log" Feb 23 09:07:07 crc kubenswrapper[5047]: I0223 09:07:07.579687 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-2cbb6_50b2c3a5-05f7-4f75-a03b-b79278119309/manager/0.log" Feb 23 09:07:07 crc kubenswrapper[5047]: I0223 09:07:07.628103 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-t4qfd_f55fc40d-b5bb-4ba4-8b1c-1136aad2fc04/manager/0.log" Feb 23 09:07:07 crc kubenswrapper[5047]: I0223 09:07:07.805826 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-p9fc8_d4f545e3-e6a7-41f1-84cb-895175df22cf/operator/0.log" Feb 23 09:07:07 crc kubenswrapper[5047]: I0223 09:07:07.861970 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-d8rhk_f45166f5-93e4-4787-a25f-0c19e1e83cd5/manager/0.log" Feb 23 09:07:08 crc kubenswrapper[5047]: I0223 09:07:08.235441 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-5v756_7340bc8e-daf4-474d-9b0e-3363755d8f43/manager/0.log" Feb 23 09:07:08 crc kubenswrapper[5047]: I0223 09:07:08.263603 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-fqngb_ada55ddd-c9d2-47ac-b8e2-efcfe2b45bd7/manager/0.log" Feb 23 09:07:08 crc kubenswrapper[5047]: I0223 09:07:08.448083 5047 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-6f5tp_139b2268-6b8e-42e2-934a-639863e06507/manager/0.log" Feb 23 09:07:09 crc kubenswrapper[5047]: I0223 09:07:09.492713 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-k9ww2_a6a5f402-78f7-45b9-8358-c16da1787c4e/manager/0.log" Feb 23 09:07:10 crc kubenswrapper[5047]: I0223 09:07:10.269789 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ft5sm_e2712b1a-9cd8-40f1-aedd-8c6146c4182e/manager/0.log" Feb 23 09:07:28 crc kubenswrapper[5047]: I0223 09:07:28.087431 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-22tgp_99864770-491a-4f8e-8f3f-688436dc18ba/control-plane-machine-set-operator/0.log" Feb 23 09:07:28 crc kubenswrapper[5047]: I0223 09:07:28.259516 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cfg96_cca77003-2334-4205-97ae-87f0ae6d34cc/kube-rbac-proxy/0.log" Feb 23 09:07:28 crc kubenswrapper[5047]: I0223 09:07:28.313337 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cfg96_cca77003-2334-4205-97ae-87f0ae6d34cc/machine-api-operator/0.log" Feb 23 09:07:39 crc kubenswrapper[5047]: I0223 09:07:39.724505 5047 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-sb-1" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="ovsdbserver-sb" containerID="cri-o://1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224" gracePeriod=300 Feb 23 09:07:39 crc kubenswrapper[5047]: I0223 09:07:39.725307 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-1" 
podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="ovsdbserver-sb" containerID="cri-o://1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224" gracePeriod=2 Feb 23 09:07:39 crc kubenswrapper[5047]: I0223 09:07:39.736798 5047 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-sb-2" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="ovsdbserver-sb" containerID="cri-o://d823703061a177710055372592a26735509301e028b1a82a26fc0eb8b4c976a3" gracePeriod=300 Feb 23 09:07:39 crc kubenswrapper[5047]: I0223 09:07:39.737008 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-2" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="ovsdbserver-sb" containerID="cri-o://d823703061a177710055372592a26735509301e028b1a82a26fc0eb8b4c976a3" gracePeriod=2 Feb 23 09:07:39 crc kubenswrapper[5047]: E0223 09:07:39.737518 5047 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 23 09:07:39 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Feb 23 09:07:39 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_TYPE=sb Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ sb == \s\b ]] Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ hostname Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ ovsdbserver-sb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Feb 23 09:07:39 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl 
cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 
09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc 
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
[... the seven-line poll iteration above repeats verbatim for the rest of 09:07:39; STATUS is "leaving" on every pass, and the traced grep/awk pipeline lines occasionally appear in a different order ...]
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc 
kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk 
-e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc 
kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving 
-o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 
09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl 
-t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep 
Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 
09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-sb" pod="openstack/ovsdbserver-sb-1" message=< Feb 23 09:07:39 crc kubenswrapper[5047]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Feb 23 09:07:39 crc kubenswrapper[5047]: + 
source /usr/local/bin/container-scripts/functions Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_TYPE=sb Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ sb == \s\b ]] Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ hostname Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ ovsdbserver-sb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Feb 23 09:07:39 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc 
kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc 
kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: >
Feb 23 09:07:39 crc kubenswrapper[5047]: E0223 09:07:39.737726 5047 kuberuntime_container.go:691] "PreStop hook failed" err=<
Feb 23 09:07:39 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Feb 23 09:07:39 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_TYPE=sb
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnsb_db.db
Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound
Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ sb == \s\b ]]
Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ hostname
Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ ovsdbserver-sb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]]
Feb 23 09:07:39 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc 
kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 
09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep 
Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc 
kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving 
-o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 
09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl 
-t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 
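The `set -x` trace above corresponds to a simple polling loop: read the cluster member's `Status:` field once per second until it reports that it has left. The sketch below is a reconstruction inferred from the trace, not the actual OVN script; `cluster_status` is a hypothetical stub standing in for the real `ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound | grep Status: | awk '{print $2}'` pipeline, wired to return "leaving" twice and then "left cluster" so the sketch is self-contained and terminates.

```shell
#!/bin/sh
# Reconstruction of the wait loop seen in the set -x trace (an assumption,
# not the verbatim source). The real script polls ovs-appctl; here
# cluster_status is a stub that simulates the member finishing its departure.
state_file=$(mktemp)
echo 0 > "$state_file"

cluster_status() {
    # Count calls via a temp file (command substitution runs in a subshell,
    # so a plain shell variable would not persist between calls).
    n=$(cat "$state_file")
    n=$((n + 1))
    echo "$n" > "$state_file"
    if [ "$n" -lt 3 ]; then
        echo "leaving"          # member still in the process of leaving
    else
        echo "left cluster"     # departure complete
    fi
}

while true; do
    STATUS=$(cluster_status)
    # Same test as in the trace: stop once the status is empty
    # or reports "left cluster".
    if [ -z "$STATUS" -o "x$STATUS" = "xleft cluster" ]; then
        break
    fi
    sleep 1
done
rm -f "$state_file"
echo "final status: $STATUS"
```

The `x` prefixes in the comparison are the classic portability idiom for guarding `test` against operands that look like options; with the stub above the loop sleeps twice and then prints `final status: left cluster`.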
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print
$2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc 
kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
[this poll iteration repeats verbatim many more times within the same second (Feb 23 09:07:39) while OVN_Southbound stays at Status: leaving; the grep/awk pipeline stages are occasionally logged out of order]
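The loop visible in this trace can be reconstructed as a short shell sketch. This is a hedged reconstruction, not the actual script shipped in the image: the `cluster_status` function is a stand-in for the real `ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound` call, and its behavior (reporting "left cluster" after three polls) is invented so the sketch is self-contained and runnable anywhere.

```shell
#!/bin/sh
# Hedged reconstruction of the wait loop producing the trace above.
# Assumption: the real script pipes `ovs-appctl ... cluster/status
# OVN_Southbound` through grep/awk; that command is stubbed here.
POLLS=0

cluster_status() {
    # Stub: report "leaving" for the first three polls, then "left cluster".
    if [ "$POLLS" -ge 3 ]; then
        echo "Status: left cluster"
    else
        echo "Status: leaving"
    fi
}

while true; do
    STATUS=$(cluster_status | grep 'Status:' | awk '{print $2}')
    # Stop once the server is gone (empty output) or reports it has left.
    # Note: with `awk '{print $2}'`, "Status: left cluster" yields just
    # "left", so the trace's comparison against 'xleft cluster' likely
    # only succeeds via the -z branch once the server process exits.
    if [ -z "$STATUS" ] || [ "x$STATUS" = "xleft" ]; then
        break
    fi
    POLLS=$((POLLS + 1))
    sleep 0.1   # the logged script sleeps 1s between polls
done
echo "left after $POLLS polls"   # prints: left after 3 polls
```

The repeated `STATUS=leaving` entries in the journal are simply this loop's `set -x` trace, logged once per second until the raft member finishes leaving the cluster.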
Feb
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: > pod="openstack/ovsdbserver-sb-1" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="ovsdbserver-sb" containerID="cri-o://1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224" Feb 23 09:07:39 crc kubenswrapper[5047]: E0223 09:07:39.748944 5047 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 23 09:07:39 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Feb 23 09:07:39 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_TYPE=sb Feb 23 09:07:39 
crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ sb == \s\b ]] Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ hostname Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Feb 23 09:07:39 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 
crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 
23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc 
kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z 
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:39 crc kubenswrapper[5047]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-sb" pod="openstack/ovsdbserver-sb-2" message=<
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Feb 23 09:07:39 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_TYPE=sb
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnsb_db.db
Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound
Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ sb == \s\b ]]
Feb 23 09:07:39 crc kubenswrapper[5047]: + DB_NAME=OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ hostname
Feb 23 09:07:39 crc kubenswrapper[5047]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]]
Feb 23 09:07:39 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: + true
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1
cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc 
kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:39 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:39 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:39 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:39 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:39 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:39 crc kubenswrapper[5047]: + true Feb 23 09:07:39 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40
crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: > Feb 23 09:07:40 crc kubenswrapper[5047]: E0223 09:07:39.749413 5047 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 23 09:07:40 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Feb 23 09:07:40 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_TYPE=sb Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ sb == \s\b ]] Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ hostname Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Feb 23
09:07:40 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc 
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 
23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: 
+ '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 
09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 
crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e 
'{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
[... the same status-check iteration (+ true / ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound / ++ grep Status: / ++ awk -e '{print $2}' / + STATUS=leaving / + '[' -z leaving -o xleaving = 'xleft cluster' ']' / + sleep 1) repeats verbatim for the remainder of the grace period; only the interleaving order of the grep/awk/ovs-appctl trace lines varies between iterations ...]
Feb 23 09:07:40 crc kubenswrapper[5047]: > pod="openstack/ovsdbserver-sb-2"
podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="ovsdbserver-sb" containerID="cri-o://d823703061a177710055372592a26735509301e028b1a82a26fc0eb8b4c976a3" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:39.859094 5047 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-nb-2" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="ovsdbserver-nb" containerID="cri-o://e18a052509e18885714dbaa0ad71543da1d6cd717884fceb5aff08abd9fae5c9" gracePeriod=300 Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:39.859515 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-2" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="ovsdbserver-nb" containerID="cri-o://e18a052509e18885714dbaa0ad71543da1d6cd717884fceb5aff08abd9fae5c9" gracePeriod=2 Feb 23 09:07:40 crc kubenswrapper[5047]: E0223 09:07:39.870696 5047 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 23 09:07:40 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Feb 23 09:07:40 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_TYPE=nb Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ nb == \s\b ]] Feb 23 09:07:40 crc kubenswrapper[5047]: ++ hostname Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Feb 23 09:07:40 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
[... the same status-check iteration (+ true / ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound / ++ grep Status: / ++ awk -e '{print $2}' / + STATUS=leaving / + '[' -z leaving -o xleaving = 'xleft cluster' ']' / + sleep 1) repeats verbatim until the log excerpt is cut off mid-line ...]
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-nb" pod="openstack/ovsdbserver-nb-2" message=<
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Feb 23 09:07:40 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_TYPE=nb
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ nb == \s\b ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ hostname
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk 
-e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving 
-o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
[... identical status-poll iterations with STATUS=leaving repeat ...]
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: >
Feb 23 09:07:40 crc kubenswrapper[5047]: E0223 09:07:39.871077 5047 kuberuntime_container.go:691] "PreStop hook failed" err=<
Feb 23 09:07:40 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Feb 23 09:07:40 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_TYPE=nb
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ nb == \s\b ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ hostname
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
[... identical status-poll iterations with STATUS=leaving repeat ...]
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving
-o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 
09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl 
-t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving 
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 
23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true 
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving 
-o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 
09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl 
-t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: > pod="openstack/ovsdbserver-nb-2" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="ovsdbserver-nb" containerID="cri-o://e18a052509e18885714dbaa0ad71543da1d6cd717884fceb5aff08abd9fae5c9" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:39.892989 5047 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-nb-1" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="ovsdbserver-nb" containerID="cri-o://b95626b92928b330aeada129bc33bfc7f515fbbc0d2b15b4671ea5ae25433071" gracePeriod=300 Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:39.893435 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-1" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="ovsdbserver-nb" 
containerID="cri-o://b95626b92928b330aeada129bc33bfc7f515fbbc0d2b15b4671ea5ae25433071" gracePeriod=2
Feb 23 09:07:40 crc kubenswrapper[5047]: E0223 09:07:39.907843 5047 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Feb 23 09:07:40 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Feb 23 09:07:40 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_TYPE=nb
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ nb == \s\b ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ hostname
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 
09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + 
'[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 
09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 
crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
-t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
[... the same cluster/status poll iteration (STATUS=leaving, sleep 1) repeats verbatim; duplicate iterations elided ...]
Feb 23 09:07:40 crc kubenswrapper[5047]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-nb" pod="openstack/ovsdbserver-nb-1" message=<
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Feb 23 09:07:40 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_TYPE=nb
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ nb == \s\b ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ hostname
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
[... the poll iteration continues to repeat with STATUS=leaving; duplicate iterations elided ...]
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving 
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 
23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: >
Feb 23 09:07:40 crc kubenswrapper[5047]: E0223 09:07:39.908029 5047 kuberuntime_container.go:691] "PreStop hook failed" err=<
Feb 23 09:07:40 crc kubenswrapper[5047]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Feb 23 09:07:40 crc kubenswrapper[5047]: + source /usr/local/bin/container-scripts/functions
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_TYPE=nb
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Feb 23 09:07:40 crc kubenswrapper[5047]: + DB_NAME=OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ nb == \s\b ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ hostname
Feb 23 09:07:40 crc kubenswrapper[5047]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Feb 23 09:07:40 crc kubenswrapper[5047]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
$2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc 
kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc 
kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 
crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: 
++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: + true
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 
09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 
23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + 
STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc 
kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 
crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 
Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc 
kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}' Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1 Feb 23 09:07:40 crc kubenswrapper[5047]: + true Feb 23 09:07:40 crc kubenswrapper[5047]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status: Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print 
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ grep Status:
Feb 23 09:07:40 crc kubenswrapper[5047]: ++ awk -e '{print $2}'
Feb 23 09:07:40 crc kubenswrapper[5047]: + STATUS=leaving
Feb 23 09:07:40 crc kubenswrapper[5047]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Feb 23 09:07:40 crc kubenswrapper[5047]: + sleep 1
Feb 23 09:07:40 crc kubenswrapper[5047]: > pod="openstack/ovsdbserver-nb-1" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="ovsdbserver-nb" containerID="cri-o://b95626b92928b330aeada129bc33bfc7f515fbbc0d2b15b4671ea5ae25433071"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.696388 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_146c85be-9d67-4281-873e-b27f5e90d957/ovsdbserver-nb/0.log"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.696736 5047 generic.go:334] "Generic (PLEG): container finished" podID="146c85be-9d67-4281-873e-b27f5e90d957" containerID="b95626b92928b330aeada129bc33bfc7f515fbbc0d2b15b4671ea5ae25433071" exitCode=143
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.696787 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"146c85be-9d67-4281-873e-b27f5e90d957","Type":"ContainerDied","Data":"b95626b92928b330aeada129bc33bfc7f515fbbc0d2b15b4671ea5ae25433071"}
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.700939 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_74acc7db-4095-419a-9b09-afa04283a69f/ovsdbserver-sb/0.log"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.701051 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.708710 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_74acc7db-4095-419a-9b09-afa04283a69f/ovsdbserver-sb/0.log"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.708783 5047 generic.go:334] "Generic (PLEG): container finished" podID="74acc7db-4095-419a-9b09-afa04283a69f" containerID="1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224" exitCode=143
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.708841 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"74acc7db-4095-419a-9b09-afa04283a69f","Type":"ContainerDied","Data":"1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224"}
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.708898 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"74acc7db-4095-419a-9b09-afa04283a69f","Type":"ContainerDied","Data":"ab37837a45462c9622cfc0a373f7b8dc97474dd28b811ca6b9653d8931daf9f8"}
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.708946 5047 scope.go:117] "RemoveContainer" containerID="c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.746067 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0c6cd063-9d8c-4689-8e90-346ae2ec1ea8/ovsdbserver-nb/0.log"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.746115 5047 generic.go:334] "Generic (PLEG): container finished" podID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerID="e18a052509e18885714dbaa0ad71543da1d6cd717884fceb5aff08abd9fae5c9" exitCode=143
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.746184 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8","Type":"ContainerDied","Data":"e18a052509e18885714dbaa0ad71543da1d6cd717884fceb5aff08abd9fae5c9"}
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.767882 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_385bab5b-7cad-4274-b672-a0614be3f41e/ovsdbserver-sb/0.log"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.767970 5047 generic.go:334] "Generic (PLEG): container finished" podID="385bab5b-7cad-4274-b672-a0614be3f41e" containerID="d823703061a177710055372592a26735509301e028b1a82a26fc0eb8b4c976a3" exitCode=143
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.768011 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"385bab5b-7cad-4274-b672-a0614be3f41e","Type":"ContainerDied","Data":"d823703061a177710055372592a26735509301e028b1a82a26fc0eb8b4c976a3"}
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.783373 5047 scope.go:117] "RemoveContainer" containerID="1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.798381 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_146c85be-9d67-4281-873e-b27f5e90d957/ovsdbserver-nb/0.log"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.798463 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.808250 5047 scope.go:117] "RemoveContainer" containerID="c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1"
Feb 23 09:07:40 crc kubenswrapper[5047]: E0223 09:07:40.808631 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1\": container with ID starting with c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1 not found: ID does not exist" containerID="c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.808674 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1"} err="failed to get container status \"c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1\": rpc error: code = NotFound desc = could not find container \"c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1\": container with ID starting with c44ef2aca4d1a0638cb9dfd26e45f7f7dab935e68d88e63b1aa79658027107f1 not found: ID does not exist"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.808703 5047 scope.go:117] "RemoveContainer" containerID="1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224"
Feb 23 09:07:40 crc kubenswrapper[5047]: E0223 09:07:40.809135 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224\": container with ID starting with 1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224 not found: ID does not exist" containerID="1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.809157 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224"} err="failed to get container status \"1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224\": rpc error: code = NotFound desc = could not find container \"1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224\": container with ID starting with 1b6ad3f507fe9aab96c5568f9fa62167cd4fef0979c86781065879e36fd3e224 not found: ID does not exist"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.812843 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_385bab5b-7cad-4274-b672-a0614be3f41e/ovsdbserver-sb/0.log"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.812975 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.861741 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-config\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862381 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862493 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-config" (OuterVolumeSpecName: "config") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862534 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-scripts\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862626 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-metrics-certs-tls-certs\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862653 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74acc7db-4095-419a-9b09-afa04283a69f-ovsdb-rundir\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862682 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/74acc7db-4095-419a-9b09-afa04283a69f-kube-api-access-mkz4p\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862716 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-combined-ca-bundle\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.862750 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-ovsdbserver-sb-tls-certs\") pod \"74acc7db-4095-419a-9b09-afa04283a69f\" (UID: \"74acc7db-4095-419a-9b09-afa04283a69f\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.863296 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-config\") on node \"crc\" DevicePath \"\""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.866379 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74acc7db-4095-419a-9b09-afa04283a69f-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.866749 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-scripts" (OuterVolumeSpecName: "scripts") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.876590 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74acc7db-4095-419a-9b09-afa04283a69f-kube-api-access-mkz4p" (OuterVolumeSpecName: "kube-api-access-mkz4p") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "kube-api-access-mkz4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.900989 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.901101 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.928068 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.937747 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "74acc7db-4095-419a-9b09-afa04283a69f" (UID: "74acc7db-4095-419a-9b09-afa04283a69f"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.965978 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-combined-ca-bundle\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966040 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-ovsdbserver-nb-tls-certs\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966110 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-scripts\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966686 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966720 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgjm7\" (UniqueName: \"kubernetes.io/projected/146c85be-9d67-4281-873e-b27f5e90d957-kube-api-access-mgjm7\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966744 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-config\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966816 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdb-rundir\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966864 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-combined-ca-bundle\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966892 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdbserver-sb-tls-certs\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966954 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-metrics-certs-tls-certs\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") "
Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.966982 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-metrics-certs-tls-certs\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: 
\"385bab5b-7cad-4274-b672-a0614be3f41e\") " Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967011 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn7bw\" (UniqueName: \"kubernetes.io/projected/385bab5b-7cad-4274-b672-a0614be3f41e-kube-api-access-kn7bw\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967062 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-config\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967096 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-scripts\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967121 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/146c85be-9d67-4281-873e-b27f5e90d957-ovsdb-rundir\") pod \"146c85be-9d67-4281-873e-b27f5e90d957\" (UID: \"146c85be-9d67-4281-873e-b27f5e90d957\") " Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967500 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\") pod \"385bab5b-7cad-4274-b672-a0614be3f41e\" (UID: \"385bab5b-7cad-4274-b672-a0614be3f41e\") " Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967580 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-config" (OuterVolumeSpecName: "config") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967872 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\") on node \"crc\" " Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967894 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74acc7db-4095-419a-9b09-afa04283a69f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967921 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-scripts" (OuterVolumeSpecName: "scripts") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.967955 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.968276 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.968680 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74acc7db-4095-419a-9b09-afa04283a69f-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.968695 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkz4p\" (UniqueName: \"kubernetes.io/projected/74acc7db-4095-419a-9b09-afa04283a69f-kube-api-access-mkz4p\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.968704 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.968712 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.968721 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74acc7db-4095-419a-9b09-afa04283a69f-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.971315 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146c85be-9d67-4281-873e-b27f5e90d957-kube-api-access-mgjm7" (OuterVolumeSpecName: "kube-api-access-mgjm7") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "kube-api-access-mgjm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.972127 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-config" (OuterVolumeSpecName: "config") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.976298 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146c85be-9d67-4281-873e-b27f5e90d957-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.976504 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-scripts" (OuterVolumeSpecName: "scripts") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.980379 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385bab5b-7cad-4274-b672-a0614be3f41e-kube-api-access-kn7bw" (OuterVolumeSpecName: "kube-api-access-kn7bw") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "kube-api-access-kn7bw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.987949 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:40 crc kubenswrapper[5047]: I0223 09:07:40.990925 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.005739 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.005965 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110") on node "crc" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.006206 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.021745 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "pvc-f7bf93e5-6c47-47d7-a283-272db430f832". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.025103 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.025383 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.052388 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "385bab5b-7cad-4274-b672-a0614be3f41e" (UID: "385bab5b-7cad-4274-b672-a0614be3f41e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.055914 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "146c85be-9d67-4281-873e-b27f5e90d957" (UID: "146c85be-9d67-4281-873e-b27f5e90d957"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069804 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069839 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b31958-b6c3-43c2-8ec7-9726fa2c2110\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069851 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/146c85be-9d67-4281-873e-b27f5e90d957-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069860 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/146c85be-9d67-4281-873e-b27f5e90d957-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069888 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\") on node \"crc\" " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069914 5047 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069949 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069959 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385bab5b-7cad-4274-b672-a0614be3f41e-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069973 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f7bf93e5-6c47-47d7-a283-272db430f832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832\") on node \"crc\" " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069982 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgjm7\" (UniqueName: \"kubernetes.io/projected/146c85be-9d67-4281-873e-b27f5e90d957-kube-api-access-mgjm7\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069991 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.069999 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.070006 5047 reconciler_common.go:293] "Volume detached for 
volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.070015 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/146c85be-9d67-4281-873e-b27f5e90d957-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.070022 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/385bab5b-7cad-4274-b672-a0614be3f41e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.070030 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn7bw\" (UniqueName: \"kubernetes.io/projected/385bab5b-7cad-4274-b672-a0614be3f41e-kube-api-access-kn7bw\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.085562 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.085743 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f7bf93e5-6c47-47d7-a283-272db430f832" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832") on node "crc" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.093647 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.093891 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4") on node "crc" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.171278 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-f7bf93e5-6c47-47d7-a283-272db430f832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7bf93e5-6c47-47d7-a283-272db430f832\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.171320 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9632426e-1379-4ab4-a0b6-d3eb9e649bc4\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.183145 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0c6cd063-9d8c-4689-8e90-346ae2ec1ea8/ovsdbserver-nb/0.log" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.183219 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.373655 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-scripts\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.373722 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdb-rundir\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.373758 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdbserver-nb-tls-certs\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.373790 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-config\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.374079 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.374230 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-scripts" (OuterVolumeSpecName: "scripts") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.374511 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-config" (OuterVolumeSpecName: "config") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.376257 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.376398 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvbwj\" (UniqueName: \"kubernetes.io/projected/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-kube-api-access-tvbwj\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.376466 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-combined-ca-bundle\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") 
" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.376526 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-metrics-certs-tls-certs\") pod \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\" (UID: \"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8\") " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.377112 5047 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.377143 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.377161 5047 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-config\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.382044 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-kube-api-access-tvbwj" (OuterVolumeSpecName: "kube-api-access-tvbwj") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). InnerVolumeSpecName "kube-api-access-tvbwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.388734 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). 
InnerVolumeSpecName "pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.407024 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.431577 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.435325 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" (UID: "0c6cd063-9d8c-4689-8e90-346ae2ec1ea8"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.478748 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvbwj\" (UniqueName: \"kubernetes.io/projected/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-kube-api-access-tvbwj\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.478793 5047 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.478809 5047 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.478821 5047 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.478869 5047 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\") on node \"crc\" " Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.498889 5047 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.499129 5047 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c") on node "crc" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.580079 5047 reconciler_common.go:293] "Volume detached for volume \"pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5caa6ea7-be1c-4e26-8f31-a46e9729155c\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.676237 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-8rv27_8f24c3b5-950d-40ec-a7cf-e71d942a57aa/cert-manager-controller/0.log" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.775980 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.779111 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0c6cd063-9d8c-4689-8e90-346ae2ec1ea8/ovsdbserver-nb/0.log" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.779335 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c6cd063-9d8c-4689-8e90-346ae2ec1ea8","Type":"ContainerDied","Data":"c1fa523d70d012697da63410618631befcddece6cb31ecc5063ca4369f02f911"} Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.779345 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.779508 5047 scope.go:117] "RemoveContainer" containerID="dc2a3c442046d0ca90c2e2cf7cce3b69605e0fac97b43c4fa57bbf160f1e4fe4" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.781385 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_385bab5b-7cad-4274-b672-a0614be3f41e/ovsdbserver-sb/0.log" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.781518 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.781672 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"385bab5b-7cad-4274-b672-a0614be3f41e","Type":"ContainerDied","Data":"8533a01011ace10ae16910553d260b19301102bbb29f5efb54691e2b02ca56af"} Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.788976 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_146c85be-9d67-4281-873e-b27f5e90d957/ovsdbserver-nb/0.log" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.789046 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"146c85be-9d67-4281-873e-b27f5e90d957","Type":"ContainerDied","Data":"7190a44dff669a9b3906efced5dbb427934cbde97e946429826653f924f96da8"} Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.789140 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.799482 5047 scope.go:117] "RemoveContainer" containerID="e18a052509e18885714dbaa0ad71543da1d6cd717884fceb5aff08abd9fae5c9" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.828702 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.830565 5047 scope.go:117] "RemoveContainer" containerID="f3246bebc9ebdff6972450f73eaa67d34d962a161929ce55ff3505b8bc91d983" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.842757 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.850894 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.859038 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.861184 5047 scope.go:117] "RemoveContainer" containerID="d823703061a177710055372592a26735509301e028b1a82a26fc0eb8b4c976a3" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.866253 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.872385 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.880015 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.882916 5047 scope.go:117] "RemoveContainer" containerID="855e342977c6d200e9eb4077a8fada2e25afccca32a14e64e70912cbfbb54b29" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.886678 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-hrqps_20a7cb7d-0122-4d8b-b98c-ed082d5fb596/cert-manager-cainjector/0.log" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.903049 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.941669 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-rbg92_7ccbccf7-5572-4158-b181-e23c3ef00a05/cert-manager-webhook/0.log" Feb 23 09:07:41 crc kubenswrapper[5047]: I0223 09:07:41.942144 5047 scope.go:117] "RemoveContainer" containerID="b95626b92928b330aeada129bc33bfc7f515fbbc0d2b15b4671ea5ae25433071" Feb 23 09:07:42 crc kubenswrapper[5047]: I0223 09:07:42.349361 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" path="/var/lib/kubelet/pods/0c6cd063-9d8c-4689-8e90-346ae2ec1ea8/volumes" Feb 23 09:07:42 crc kubenswrapper[5047]: I0223 09:07:42.350064 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146c85be-9d67-4281-873e-b27f5e90d957" path="/var/lib/kubelet/pods/146c85be-9d67-4281-873e-b27f5e90d957/volumes" Feb 23 09:07:42 crc kubenswrapper[5047]: I0223 09:07:42.351182 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" path="/var/lib/kubelet/pods/385bab5b-7cad-4274-b672-a0614be3f41e/volumes" Feb 23 09:07:42 crc kubenswrapper[5047]: I0223 09:07:42.351805 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74acc7db-4095-419a-9b09-afa04283a69f" path="/var/lib/kubelet/pods/74acc7db-4095-419a-9b09-afa04283a69f/volumes" Feb 23 09:07:54 crc kubenswrapper[5047]: I0223 09:07:54.385956 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-jrmbs_4a3dbc97-61d9-4247-b652-f136cfc02688/nmstate-console-plugin/0.log" Feb 23 09:07:54 crc 
kubenswrapper[5047]: I0223 09:07:54.525936 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kwhjc_48daf116-bd6d-462a-baf4-6dfb51456617/nmstate-handler/0.log" Feb 23 09:07:54 crc kubenswrapper[5047]: I0223 09:07:54.560097 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mwjdq_c7816fce-0cd4-4edb-a9df-3589458e007a/kube-rbac-proxy/0.log" Feb 23 09:07:54 crc kubenswrapper[5047]: I0223 09:07:54.638091 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mwjdq_c7816fce-0cd4-4edb-a9df-3589458e007a/nmstate-metrics/0.log" Feb 23 09:07:54 crc kubenswrapper[5047]: I0223 09:07:54.775929 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-xrldn_82af618b-d2b3-4b63-8112-54073bbbea1e/nmstate-operator/0.log" Feb 23 09:07:54 crc kubenswrapper[5047]: I0223 09:07:54.833685 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-wqj42_56627564-fec8-4817-9883-639384c8c1ed/nmstate-webhook/0.log" Feb 23 09:08:09 crc kubenswrapper[5047]: I0223 09:08:09.188270 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-r6zrs_ecb737b5-f02c-4aea-b4d0-93c516c3c258/prometheus-operator/0.log" Feb 23 09:08:09 crc kubenswrapper[5047]: I0223 09:08:09.395327 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h_8e2c1208-29b7-4cda-8067-c429af1b5d63/prometheus-operator-admission-webhook/0.log" Feb 23 09:08:09 crc kubenswrapper[5047]: I0223 09:08:09.397179 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8_a1e4753e-1eb8-4961-9669-72b6abe068e4/prometheus-operator-admission-webhook/0.log" Feb 23 
09:08:09 crc kubenswrapper[5047]: I0223 09:08:09.578734 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-x49nh_ebcba915-39c3-4516-89c6-ccc956b12b99/operator/0.log" Feb 23 09:08:09 crc kubenswrapper[5047]: I0223 09:08:09.611680 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gv8g5_b0642e24-413d-4eea-83fb-e959cdcbc6dd/perses-operator/0.log" Feb 23 09:08:23 crc kubenswrapper[5047]: I0223 09:08:23.771559 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-9sdh2_2e8d69d4-4bff-47e9-950f-c07c56adad77/kube-rbac-proxy/0.log" Feb 23 09:08:23 crc kubenswrapper[5047]: I0223 09:08:23.997303 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-frr-files/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.234311 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-reloader/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.295194 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-metrics/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.295678 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-frr-files/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.456353 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-reloader/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.540207 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-9sdh2_2e8d69d4-4bff-47e9-950f-c07c56adad77/controller/0.log" Feb 23 
09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.621644 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-reloader/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.627743 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-frr-files/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.652077 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-metrics/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.723352 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-metrics/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.906586 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-reloader/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.926702 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-frr-files/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.927252 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/cp-metrics/0.log" Feb 23 09:08:24 crc kubenswrapper[5047]: I0223 09:08:24.980889 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/controller/0.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.126636 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/frr/1.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.145608 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/frr-metrics/0.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.221091 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/kube-rbac-proxy/0.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.393574 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/reloader/0.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.412579 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/kube-rbac-proxy-frr/0.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.570530 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-bhglb_59ed94c1-a5ba-4e59-b112-68a944434be0/frr-k8s-webhook-server/0.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.710874 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-65b877fd6b-65kbc_a2e5f18f-e775-4839-8e1b-a5b05c2af4e9/manager/0.log" Feb 23 09:08:25 crc kubenswrapper[5047]: I0223 09:08:25.832517 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74bbf6f644-rwt8j_d9b84120-1d90-41d1-8686-535d6dcfb6d4/webhook-server/0.log" Feb 23 09:08:26 crc kubenswrapper[5047]: I0223 09:08:26.062706 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wxppp_d6b13e52-f7be-49a3-accf-f5084c8d38a6/kube-rbac-proxy/0.log" Feb 23 09:08:26 crc kubenswrapper[5047]: I0223 09:08:26.736807 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wxppp_d6b13e52-f7be-49a3-accf-f5084c8d38a6/speaker/0.log" Feb 23 09:08:27 crc kubenswrapper[5047]: I0223 09:08:27.644943 5047 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nh5jn_e7785968-bae6-4ad4-bc4a-ccc4fac2cf41/frr/0.log" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.777479 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tl8bf"] Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778168 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="ovsdbserver-sb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778186 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="ovsdbserver-sb" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778203 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="extract-utilities" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778211 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="extract-utilities" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778224 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778232 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778246 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778253 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778267 5047 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15432459-a935-4c31-897a-f62bb64b53fe" containerName="container-00" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778275 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="15432459-a935-4c31-897a-f62bb64b53fe" containerName="container-00" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778290 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="ovsdbserver-sb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778297 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="ovsdbserver-sb" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778306 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778313 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778330 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="ovsdbserver-nb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778337 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="ovsdbserver-nb" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778354 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778362 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 
09:08:30.778399 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="extract-content" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778409 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="extract-content" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778425 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="ovsdbserver-nb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778433 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="ovsdbserver-nb" Feb 23 09:08:30 crc kubenswrapper[5047]: E0223 09:08:30.778447 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="registry-server" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778454 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="registry-server" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778640 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="ovsdbserver-sb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778658 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="ovsdbserver-nb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778671 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="ovsdbserver-nb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778685 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6cd063-9d8c-4689-8e90-346ae2ec1ea8" containerName="openstack-network-exporter" Feb 23 09:08:30 crc 
kubenswrapper[5047]: I0223 09:08:30.778694 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e4ad09-4c59-4ed8-b826-d64856a4c7aa" containerName="registry-server" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778708 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="15432459-a935-4c31-897a-f62bb64b53fe" containerName="container-00" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778721 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778733 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="385bab5b-7cad-4274-b672-a0614be3f41e" containerName="ovsdbserver-sb" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778750 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="146c85be-9d67-4281-873e-b27f5e90d957" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.778760 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="74acc7db-4095-419a-9b09-afa04283a69f" containerName="openstack-network-exporter" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.780370 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.842085 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl8bf"] Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.941147 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbpwj\" (UniqueName: \"kubernetes.io/projected/82d2d2bf-0380-49b6-9684-60129d01a990-kube-api-access-qbpwj\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.941210 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-catalog-content\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:30 crc kubenswrapper[5047]: I0223 09:08:30.941400 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-utilities\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.042687 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-utilities\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.042835 5047 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qbpwj\" (UniqueName: \"kubernetes.io/projected/82d2d2bf-0380-49b6-9684-60129d01a990-kube-api-access-qbpwj\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.042875 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-catalog-content\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.043292 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-utilities\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.043404 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-catalog-content\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.061887 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbpwj\" (UniqueName: \"kubernetes.io/projected/82d2d2bf-0380-49b6-9684-60129d01a990-kube-api-access-qbpwj\") pod \"redhat-marketplace-tl8bf\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.162926 5047 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:31 crc kubenswrapper[5047]: I0223 09:08:31.601093 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl8bf"] Feb 23 09:08:32 crc kubenswrapper[5047]: I0223 09:08:32.138424 5047 generic.go:334] "Generic (PLEG): container finished" podID="82d2d2bf-0380-49b6-9684-60129d01a990" containerID="17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c" exitCode=0 Feb 23 09:08:32 crc kubenswrapper[5047]: I0223 09:08:32.138473 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl8bf" event={"ID":"82d2d2bf-0380-49b6-9684-60129d01a990","Type":"ContainerDied","Data":"17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c"} Feb 23 09:08:32 crc kubenswrapper[5047]: I0223 09:08:32.138499 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl8bf" event={"ID":"82d2d2bf-0380-49b6-9684-60129d01a990","Type":"ContainerStarted","Data":"5a7ed56f4b27156dc6752b0a18aea12bf115a067a6c4f5ab9f59464ddbf709ec"} Feb 23 09:08:33 crc kubenswrapper[5047]: I0223 09:08:33.148191 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl8bf" event={"ID":"82d2d2bf-0380-49b6-9684-60129d01a990","Type":"ContainerStarted","Data":"86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a"} Feb 23 09:08:34 crc kubenswrapper[5047]: I0223 09:08:34.158417 5047 generic.go:334] "Generic (PLEG): container finished" podID="82d2d2bf-0380-49b6-9684-60129d01a990" containerID="86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a" exitCode=0 Feb 23 09:08:34 crc kubenswrapper[5047]: I0223 09:08:34.158511 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl8bf" 
event={"ID":"82d2d2bf-0380-49b6-9684-60129d01a990","Type":"ContainerDied","Data":"86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a"} Feb 23 09:08:35 crc kubenswrapper[5047]: I0223 09:08:35.166875 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl8bf" event={"ID":"82d2d2bf-0380-49b6-9684-60129d01a990","Type":"ContainerStarted","Data":"d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50"} Feb 23 09:08:35 crc kubenswrapper[5047]: I0223 09:08:35.190437 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tl8bf" podStartSLOduration=2.801463538 podStartE2EDuration="5.190415457s" podCreationTimestamp="2026-02-23 09:08:30 +0000 UTC" firstStartedPulling="2026-02-23 09:08:32.141231223 +0000 UTC m=+8634.392558357" lastFinishedPulling="2026-02-23 09:08:34.530183142 +0000 UTC m=+8636.781510276" observedRunningTime="2026-02-23 09:08:35.184663601 +0000 UTC m=+8637.435990745" watchObservedRunningTime="2026-02-23 09:08:35.190415457 +0000 UTC m=+8637.441742591" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.163544 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.164134 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.216155 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.262226 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.476947 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tl8bf"] Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.489564 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2_eb5db10f-166f-4c03-b8f9-a8d549c48948/util/0.log" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.637692 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2_eb5db10f-166f-4c03-b8f9-a8d549c48948/util/0.log" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.696275 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2_eb5db10f-166f-4c03-b8f9-a8d549c48948/pull/0.log" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.741524 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2_eb5db10f-166f-4c03-b8f9-a8d549c48948/pull/0.log" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.852376 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2_eb5db10f-166f-4c03-b8f9-a8d549c48948/util/0.log" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.887270 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2_eb5db10f-166f-4c03-b8f9-a8d549c48948/extract/0.log" Feb 23 09:08:41 crc kubenswrapper[5047]: I0223 09:08:41.928496 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m5ch2_eb5db10f-166f-4c03-b8f9-a8d549c48948/pull/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.026429 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd_9d5da52b-350d-4489-ba83-8e8db44549b3/util/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.210230 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd_9d5da52b-350d-4489-ba83-8e8db44549b3/util/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.242555 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd_9d5da52b-350d-4489-ba83-8e8db44549b3/pull/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.253806 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd_9d5da52b-350d-4489-ba83-8e8db44549b3/pull/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.414190 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd_9d5da52b-350d-4489-ba83-8e8db44549b3/pull/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.418789 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd_9d5da52b-350d-4489-ba83-8e8db44549b3/util/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.421319 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bdmnd_9d5da52b-350d-4489-ba83-8e8db44549b3/extract/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.603806 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps_e914782d-7780-4bd6-947e-7d381f18c574/util/0.log" Feb 23 
09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.817251 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps_e914782d-7780-4bd6-947e-7d381f18c574/pull/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.845858 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps_e914782d-7780-4bd6-947e-7d381f18c574/util/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.853784 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps_e914782d-7780-4bd6-947e-7d381f18c574/pull/0.log" Feb 23 09:08:42 crc kubenswrapper[5047]: I0223 09:08:42.966503 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps_e914782d-7780-4bd6-947e-7d381f18c574/util/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.012735 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps_e914782d-7780-4bd6-947e-7d381f18c574/pull/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.026605 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q9jps_e914782d-7780-4bd6-947e-7d381f18c574/extract/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.168478 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pbw2h_8369b050-4b0f-461d-8629-9425f1997ee5/extract-utilities/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.222656 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tl8bf" 
podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="registry-server" containerID="cri-o://d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50" gracePeriod=2 Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.306371 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pbw2h_8369b050-4b0f-461d-8629-9425f1997ee5/extract-content/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.308189 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pbw2h_8369b050-4b0f-461d-8629-9425f1997ee5/extract-utilities/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.361459 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pbw2h_8369b050-4b0f-461d-8629-9425f1997ee5/extract-content/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.566807 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pbw2h_8369b050-4b0f-461d-8629-9425f1997ee5/extract-utilities/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.638210 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pbw2h_8369b050-4b0f-461d-8629-9425f1997ee5/extract-content/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.726301 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.729359 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-catalog-content\") pod \"82d2d2bf-0380-49b6-9684-60129d01a990\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.729439 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbpwj\" (UniqueName: \"kubernetes.io/projected/82d2d2bf-0380-49b6-9684-60129d01a990-kube-api-access-qbpwj\") pod \"82d2d2bf-0380-49b6-9684-60129d01a990\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.729459 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-utilities\") pod \"82d2d2bf-0380-49b6-9684-60129d01a990\" (UID: \"82d2d2bf-0380-49b6-9684-60129d01a990\") " Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.731585 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-utilities" (OuterVolumeSpecName: "utilities") pod "82d2d2bf-0380-49b6-9684-60129d01a990" (UID: "82d2d2bf-0380-49b6-9684-60129d01a990"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.739463 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d2d2bf-0380-49b6-9684-60129d01a990-kube-api-access-qbpwj" (OuterVolumeSpecName: "kube-api-access-qbpwj") pod "82d2d2bf-0380-49b6-9684-60129d01a990" (UID: "82d2d2bf-0380-49b6-9684-60129d01a990"). InnerVolumeSpecName "kube-api-access-qbpwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.810411 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85n7r_97f58084-18f9-4231-b254-c2c70ffc1909/extract-utilities/0.log" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.811813 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82d2d2bf-0380-49b6-9684-60129d01a990" (UID: "82d2d2bf-0380-49b6-9684-60129d01a990"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.830406 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbpwj\" (UniqueName: \"kubernetes.io/projected/82d2d2bf-0380-49b6-9684-60129d01a990-kube-api-access-qbpwj\") on node \"crc\" DevicePath \"\"" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.830437 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:08:43 crc kubenswrapper[5047]: I0223 09:08:43.830449 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82d2d2bf-0380-49b6-9684-60129d01a990-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.055199 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85n7r_97f58084-18f9-4231-b254-c2c70ffc1909/extract-utilities/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.233676 5047 generic.go:334] "Generic (PLEG): container finished" podID="82d2d2bf-0380-49b6-9684-60129d01a990" 
containerID="d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50" exitCode=0 Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.233712 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl8bf" event={"ID":"82d2d2bf-0380-49b6-9684-60129d01a990","Type":"ContainerDied","Data":"d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50"} Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.233737 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl8bf" event={"ID":"82d2d2bf-0380-49b6-9684-60129d01a990","Type":"ContainerDied","Data":"5a7ed56f4b27156dc6752b0a18aea12bf115a067a6c4f5ab9f59464ddbf709ec"} Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.233755 5047 scope.go:117] "RemoveContainer" containerID="d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.233861 5047 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl8bf" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.253030 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85n7r_97f58084-18f9-4231-b254-c2c70ffc1909/extract-content/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.265410 5047 scope.go:117] "RemoveContainer" containerID="86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.277054 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pbw2h_8369b050-4b0f-461d-8629-9425f1997ee5/registry-server/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.284081 5047 scope.go:117] "RemoveContainer" containerID="17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.284150 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl8bf"] Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.290028 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl8bf"] Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.312044 5047 scope.go:117] "RemoveContainer" containerID="d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50" Feb 23 09:08:44 crc kubenswrapper[5047]: E0223 09:08:44.314310 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50\": container with ID starting with d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50 not found: ID does not exist" containerID="d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.314363 5047 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50"} err="failed to get container status \"d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50\": rpc error: code = NotFound desc = could not find container \"d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50\": container with ID starting with d871e4e95cd8f64400f8992744888be1ea589a6ab5e7a26eca63ddba65d39d50 not found: ID does not exist" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.314386 5047 scope.go:117] "RemoveContainer" containerID="86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a" Feb 23 09:08:44 crc kubenswrapper[5047]: E0223 09:08:44.315397 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a\": container with ID starting with 86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a not found: ID does not exist" containerID="86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.315458 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a"} err="failed to get container status \"86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a\": rpc error: code = NotFound desc = could not find container \"86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a\": container with ID starting with 86a4666e6d978df21f8a6318dfa538f04e87133ec115ed6181d22eadbd00056a not found: ID does not exist" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.315493 5047 scope.go:117] "RemoveContainer" containerID="17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c" Feb 23 09:08:44 crc kubenswrapper[5047]: E0223 09:08:44.316237 5047 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c\": container with ID starting with 17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c not found: ID does not exist" containerID="17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.316269 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c"} err="failed to get container status \"17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c\": rpc error: code = NotFound desc = could not find container \"17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c\": container with ID starting with 17eeb35e40f02a5e16126300d83f0b665252c7178458d84a32938b923d7b9c7c not found: ID does not exist" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.329811 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85n7r_97f58084-18f9-4231-b254-c2c70ffc1909/extract-content/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.348529 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" path="/var/lib/kubelet/pods/82d2d2bf-0380-49b6-9684-60129d01a990/volumes" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.448562 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85n7r_97f58084-18f9-4231-b254-c2c70ffc1909/extract-utilities/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.458959 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85n7r_97f58084-18f9-4231-b254-c2c70ffc1909/extract-content/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.684961 
5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n_a40f73c1-7a41-4706-87fa-f5059a8adb4f/util/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.880237 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n_a40f73c1-7a41-4706-87fa-f5059a8adb4f/pull/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.926958 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n_a40f73c1-7a41-4706-87fa-f5059a8adb4f/util/0.log" Feb 23 09:08:44 crc kubenswrapper[5047]: I0223 09:08:44.961593 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n_a40f73c1-7a41-4706-87fa-f5059a8adb4f/pull/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.143856 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n_a40f73c1-7a41-4706-87fa-f5059a8adb4f/pull/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.193825 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n_a40f73c1-7a41-4706-87fa-f5059a8adb4f/util/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.198086 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecatnx9n_a40f73c1-7a41-4706-87fa-f5059a8adb4f/extract/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.405723 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6m29_34827715-7787-4f5b-b31d-27a4a217a266/extract-utilities/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.475483 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zp4xr_3e7c7b9b-b216-4442-88b2-6c2ad1506955/marketplace-operator/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.622459 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85n7r_97f58084-18f9-4231-b254-c2c70ffc1909/registry-server/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.637174 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6m29_34827715-7787-4f5b-b31d-27a4a217a266/extract-utilities/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.642476 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6m29_34827715-7787-4f5b-b31d-27a4a217a266/extract-content/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.662950 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6m29_34827715-7787-4f5b-b31d-27a4a217a266/extract-content/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.834150 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6m29_34827715-7787-4f5b-b31d-27a4a217a266/extract-utilities/0.log" Feb 23 09:08:45 crc kubenswrapper[5047]: I0223 09:08:45.911418 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6m29_34827715-7787-4f5b-b31d-27a4a217a266/extract-content/0.log" Feb 23 09:08:46 crc kubenswrapper[5047]: I0223 09:08:46.106077 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6m29_34827715-7787-4f5b-b31d-27a4a217a266/registry-server/0.log" Feb 23 09:08:46 crc kubenswrapper[5047]: I0223 09:08:46.321042 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7wgb_35593a4c-662a-4df0-8076-963962ffb460/extract-utilities/0.log" Feb 23 09:08:46 crc kubenswrapper[5047]: I0223 09:08:46.472951 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7wgb_35593a4c-662a-4df0-8076-963962ffb460/extract-utilities/0.log" Feb 23 09:08:46 crc kubenswrapper[5047]: I0223 09:08:46.523287 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7wgb_35593a4c-662a-4df0-8076-963962ffb460/extract-content/0.log" Feb 23 09:08:46 crc kubenswrapper[5047]: I0223 09:08:46.527215 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7wgb_35593a4c-662a-4df0-8076-963962ffb460/extract-content/0.log" Feb 23 09:08:46 crc kubenswrapper[5047]: I0223 09:08:46.702638 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7wgb_35593a4c-662a-4df0-8076-963962ffb460/extract-utilities/0.log" Feb 23 09:08:46 crc kubenswrapper[5047]: I0223 09:08:46.706897 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7wgb_35593a4c-662a-4df0-8076-963962ffb460/extract-content/0.log" Feb 23 09:08:47 crc kubenswrapper[5047]: I0223 09:08:47.508832 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7wgb_35593a4c-662a-4df0-8076-963962ffb460/registry-server/0.log" Feb 23 09:08:59 crc kubenswrapper[5047]: I0223 09:08:59.166893 5047 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-r6zrs_ecb737b5-f02c-4aea-b4d0-93c516c3c258/prometheus-operator/0.log" Feb 23 09:08:59 crc kubenswrapper[5047]: I0223 09:08:59.208329 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b74d65d4d-5s6h8_a1e4753e-1eb8-4961-9669-72b6abe068e4/prometheus-operator-admission-webhook/0.log" Feb 23 09:08:59 crc kubenswrapper[5047]: I0223 09:08:59.239553 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b74d65d4d-qgb2h_8e2c1208-29b7-4cda-8067-c429af1b5d63/prometheus-operator-admission-webhook/0.log" Feb 23 09:08:59 crc kubenswrapper[5047]: I0223 09:08:59.375395 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-x49nh_ebcba915-39c3-4516-89c6-ccc956b12b99/operator/0.log" Feb 23 09:08:59 crc kubenswrapper[5047]: I0223 09:08:59.391294 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gv8g5_b0642e24-413d-4eea-83fb-e959cdcbc6dd/perses-operator/0.log" Feb 23 09:09:03 crc kubenswrapper[5047]: I0223 09:09:03.918695 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6dksx"] Feb 23 09:09:03 crc kubenswrapper[5047]: E0223 09:09:03.919608 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="extract-content" Feb 23 09:09:03 crc kubenswrapper[5047]: I0223 09:09:03.919624 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="extract-content" Feb 23 09:09:03 crc kubenswrapper[5047]: E0223 09:09:03.919643 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="extract-utilities" Feb 23 09:09:03 crc 
kubenswrapper[5047]: I0223 09:09:03.919652 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="extract-utilities" Feb 23 09:09:03 crc kubenswrapper[5047]: E0223 09:09:03.919664 5047 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="registry-server" Feb 23 09:09:03 crc kubenswrapper[5047]: I0223 09:09:03.919672 5047 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="registry-server" Feb 23 09:09:03 crc kubenswrapper[5047]: I0223 09:09:03.919889 5047 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d2d2bf-0380-49b6-9684-60129d01a990" containerName="registry-server" Feb 23 09:09:03 crc kubenswrapper[5047]: I0223 09:09:03.921175 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:03 crc kubenswrapper[5047]: I0223 09:09:03.930058 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dksx"] Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.004253 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-utilities\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.004330 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tplvn\" (UniqueName: \"kubernetes.io/projected/e9b87240-45ef-4200-ad38-bbb19b8bcc35-kube-api-access-tplvn\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: 
I0223 09:09:04.004375 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-catalog-content\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.105421 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-utilities\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.105503 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tplvn\" (UniqueName: \"kubernetes.io/projected/e9b87240-45ef-4200-ad38-bbb19b8bcc35-kube-api-access-tplvn\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.105553 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-catalog-content\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.106048 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-utilities\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.106083 5047 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-catalog-content\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.127862 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplvn\" (UniqueName: \"kubernetes.io/projected/e9b87240-45ef-4200-ad38-bbb19b8bcc35-kube-api-access-tplvn\") pod \"redhat-operators-6dksx\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") " pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.251148 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dksx" Feb 23 09:09:04 crc kubenswrapper[5047]: I0223 09:09:04.684669 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dksx"] Feb 23 09:09:05 crc kubenswrapper[5047]: I0223 09:09:05.379436 5047 generic.go:334] "Generic (PLEG): container finished" podID="e9b87240-45ef-4200-ad38-bbb19b8bcc35" containerID="6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163" exitCode=0 Feb 23 09:09:05 crc kubenswrapper[5047]: I0223 09:09:05.379535 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dksx" event={"ID":"e9b87240-45ef-4200-ad38-bbb19b8bcc35","Type":"ContainerDied","Data":"6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163"} Feb 23 09:09:05 crc kubenswrapper[5047]: I0223 09:09:05.379711 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dksx" event={"ID":"e9b87240-45ef-4200-ad38-bbb19b8bcc35","Type":"ContainerStarted","Data":"206e6e7e9b022fa9f6cf90f214380e91b613d8847691d0844f77cbc506e981f8"} Feb 23 09:09:06 crc 
kubenswrapper[5047]: I0223 09:09:06.394654 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dksx" event={"ID":"e9b87240-45ef-4200-ad38-bbb19b8bcc35","Type":"ContainerStarted","Data":"b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85"} Feb 23 09:09:09 crc kubenswrapper[5047]: I0223 09:09:09.421580 5047 generic.go:334] "Generic (PLEG): container finished" podID="e9b87240-45ef-4200-ad38-bbb19b8bcc35" containerID="b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85" exitCode=0 Feb 23 09:09:09 crc kubenswrapper[5047]: I0223 09:09:09.421652 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dksx" event={"ID":"e9b87240-45ef-4200-ad38-bbb19b8bcc35","Type":"ContainerDied","Data":"b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85"} Feb 23 09:09:10 crc kubenswrapper[5047]: I0223 09:09:10.433963 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dksx" event={"ID":"e9b87240-45ef-4200-ad38-bbb19b8bcc35","Type":"ContainerStarted","Data":"3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d"} Feb 23 09:09:10 crc kubenswrapper[5047]: I0223 09:09:10.465348 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6dksx" podStartSLOduration=3.013362235 podStartE2EDuration="7.465328894s" podCreationTimestamp="2026-02-23 09:09:03 +0000 UTC" firstStartedPulling="2026-02-23 09:09:05.381388413 +0000 UTC m=+8667.632715547" lastFinishedPulling="2026-02-23 09:09:09.833355062 +0000 UTC m=+8672.084682206" observedRunningTime="2026-02-23 09:09:10.462839678 +0000 UTC m=+8672.714166842" watchObservedRunningTime="2026-02-23 09:09:10.465328894 +0000 UTC m=+8672.716656028" Feb 23 09:09:14 crc kubenswrapper[5047]: I0223 09:09:14.252110 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-6dksx"
Feb 23 09:09:14 crc kubenswrapper[5047]: I0223 09:09:14.252483 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6dksx"
Feb 23 09:09:15 crc kubenswrapper[5047]: I0223 09:09:15.310052 5047 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6dksx" podUID="e9b87240-45ef-4200-ad38-bbb19b8bcc35" containerName="registry-server" probeResult="failure" output=<
Feb 23 09:09:15 crc kubenswrapper[5047]: timeout: failed to connect service ":50051" within 1s
Feb 23 09:09:15 crc kubenswrapper[5047]: >
Feb 23 09:09:16 crc kubenswrapper[5047]: I0223 09:09:16.759424 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:09:16 crc kubenswrapper[5047]: I0223 09:09:16.759525 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.749606 5047 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k2r8x"]
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.751467 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.770865 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2r8x"]
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.888612 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-catalog-content\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.888946 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-utilities\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.889087 5047 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7lwb\" (UniqueName: \"kubernetes.io/projected/033e9e29-5654-4278-abe1-192153a996a2-kube-api-access-k7lwb\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.990234 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-catalog-content\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.990286 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-utilities\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.990342 5047 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7lwb\" (UniqueName: \"kubernetes.io/projected/033e9e29-5654-4278-abe1-192153a996a2-kube-api-access-k7lwb\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.991272 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-catalog-content\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:20 crc kubenswrapper[5047]: I0223 09:09:20.991529 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-utilities\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:21 crc kubenswrapper[5047]: I0223 09:09:21.021264 5047 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7lwb\" (UniqueName: \"kubernetes.io/projected/033e9e29-5654-4278-abe1-192153a996a2-kube-api-access-k7lwb\") pod \"community-operators-k2r8x\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") " pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:21 crc kubenswrapper[5047]: I0223 09:09:21.075536 5047 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:21 crc kubenswrapper[5047]: I0223 09:09:21.569945 5047 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2r8x"]
Feb 23 09:09:22 crc kubenswrapper[5047]: I0223 09:09:22.524625 5047 generic.go:334] "Generic (PLEG): container finished" podID="033e9e29-5654-4278-abe1-192153a996a2" containerID="b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f" exitCode=0
Feb 23 09:09:22 crc kubenswrapper[5047]: I0223 09:09:22.524828 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2r8x" event={"ID":"033e9e29-5654-4278-abe1-192153a996a2","Type":"ContainerDied","Data":"b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f"}
Feb 23 09:09:22 crc kubenswrapper[5047]: I0223 09:09:22.525020 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2r8x" event={"ID":"033e9e29-5654-4278-abe1-192153a996a2","Type":"ContainerStarted","Data":"dad717b76ec2b6d53c459628dcf450aecee9f7f19ffc5274327c0db9683892e3"}
Feb 23 09:09:23 crc kubenswrapper[5047]: I0223 09:09:23.534808 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2r8x" event={"ID":"033e9e29-5654-4278-abe1-192153a996a2","Type":"ContainerStarted","Data":"4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea"}
Feb 23 09:09:24 crc kubenswrapper[5047]: I0223 09:09:24.299374 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6dksx"
Feb 23 09:09:24 crc kubenswrapper[5047]: I0223 09:09:24.358020 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6dksx"
Feb 23 09:09:24 crc kubenswrapper[5047]: I0223 09:09:24.543714 5047 generic.go:334] "Generic (PLEG): container finished" podID="033e9e29-5654-4278-abe1-192153a996a2" containerID="4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea" exitCode=0
Feb 23 09:09:24 crc kubenswrapper[5047]: I0223 09:09:24.543838 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2r8x" event={"ID":"033e9e29-5654-4278-abe1-192153a996a2","Type":"ContainerDied","Data":"4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea"}
Feb 23 09:09:25 crc kubenswrapper[5047]: I0223 09:09:25.551471 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2r8x" event={"ID":"033e9e29-5654-4278-abe1-192153a996a2","Type":"ContainerStarted","Data":"7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169"}
Feb 23 09:09:25 crc kubenswrapper[5047]: I0223 09:09:25.576002 5047 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k2r8x" podStartSLOduration=3.1626472310000002 podStartE2EDuration="5.575983746s" podCreationTimestamp="2026-02-23 09:09:20 +0000 UTC" firstStartedPulling="2026-02-23 09:09:22.526517886 +0000 UTC m=+8684.777845020" lastFinishedPulling="2026-02-23 09:09:24.939854401 +0000 UTC m=+8687.191181535" observedRunningTime="2026-02-23 09:09:25.574816595 +0000 UTC m=+8687.826143749" watchObservedRunningTime="2026-02-23 09:09:25.575983746 +0000 UTC m=+8687.827310880"
Feb 23 09:09:27 crc kubenswrapper[5047]: I0223 09:09:27.750063 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dksx"]
Feb 23 09:09:27 crc kubenswrapper[5047]: I0223 09:09:27.750636 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6dksx" podUID="e9b87240-45ef-4200-ad38-bbb19b8bcc35" containerName="registry-server" containerID="cri-o://3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d" gracePeriod=2
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.278932 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dksx"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.402284 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-utilities\") pod \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") "
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.402380 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tplvn\" (UniqueName: \"kubernetes.io/projected/e9b87240-45ef-4200-ad38-bbb19b8bcc35-kube-api-access-tplvn\") pod \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") "
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.402469 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-catalog-content\") pod \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\" (UID: \"e9b87240-45ef-4200-ad38-bbb19b8bcc35\") "
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.403340 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-utilities" (OuterVolumeSpecName: "utilities") pod "e9b87240-45ef-4200-ad38-bbb19b8bcc35" (UID: "e9b87240-45ef-4200-ad38-bbb19b8bcc35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.408589 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b87240-45ef-4200-ad38-bbb19b8bcc35-kube-api-access-tplvn" (OuterVolumeSpecName: "kube-api-access-tplvn") pod "e9b87240-45ef-4200-ad38-bbb19b8bcc35" (UID: "e9b87240-45ef-4200-ad38-bbb19b8bcc35"). InnerVolumeSpecName "kube-api-access-tplvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.505993 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.506303 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tplvn\" (UniqueName: \"kubernetes.io/projected/e9b87240-45ef-4200-ad38-bbb19b8bcc35-kube-api-access-tplvn\") on node \"crc\" DevicePath \"\""
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.520992 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9b87240-45ef-4200-ad38-bbb19b8bcc35" (UID: "e9b87240-45ef-4200-ad38-bbb19b8bcc35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.573475 5047 generic.go:334] "Generic (PLEG): container finished" podID="e9b87240-45ef-4200-ad38-bbb19b8bcc35" containerID="3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d" exitCode=0
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.573520 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dksx" event={"ID":"e9b87240-45ef-4200-ad38-bbb19b8bcc35","Type":"ContainerDied","Data":"3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d"}
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.573547 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dksx" event={"ID":"e9b87240-45ef-4200-ad38-bbb19b8bcc35","Type":"ContainerDied","Data":"206e6e7e9b022fa9f6cf90f214380e91b613d8847691d0844f77cbc506e981f8"}
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.573567 5047 scope.go:117] "RemoveContainer" containerID="3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.573588 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dksx"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.601302 5047 scope.go:117] "RemoveContainer" containerID="b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.608047 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dksx"]
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.609005 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b87240-45ef-4200-ad38-bbb19b8bcc35-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.616241 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6dksx"]
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.634374 5047 scope.go:117] "RemoveContainer" containerID="6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.653681 5047 scope.go:117] "RemoveContainer" containerID="3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d"
Feb 23 09:09:28 crc kubenswrapper[5047]: E0223 09:09:28.654629 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d\": container with ID starting with 3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d not found: ID does not exist" containerID="3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.654682 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d"} err="failed to get container status \"3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d\": rpc error: code = NotFound desc = could not find container \"3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d\": container with ID starting with 3ece09521bfb54b1383fc7fd42275e0383d9fb7adf8c263b4113e87ddcf1ae9d not found: ID does not exist"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.654714 5047 scope.go:117] "RemoveContainer" containerID="b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85"
Feb 23 09:09:28 crc kubenswrapper[5047]: E0223 09:09:28.655394 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85\": container with ID starting with b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85 not found: ID does not exist" containerID="b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.655422 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85"} err="failed to get container status \"b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85\": rpc error: code = NotFound desc = could not find container \"b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85\": container with ID starting with b17d77d364973c9d4f8e75dcee487d113c329351f21f1231076d411c935d5a85 not found: ID does not exist"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.655440 5047 scope.go:117] "RemoveContainer" containerID="6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163"
Feb 23 09:09:28 crc kubenswrapper[5047]: E0223 09:09:28.655713 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163\": container with ID starting with 6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163 not found: ID does not exist" containerID="6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163"
Feb 23 09:09:28 crc kubenswrapper[5047]: I0223 09:09:28.655740 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163"} err="failed to get container status \"6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163\": rpc error: code = NotFound desc = could not find container \"6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163\": container with ID starting with 6c4240ef6eb32be628819c83e18f1e7b7ad16c1abb635ccb3f5d034a7b35c163 not found: ID does not exist"
Feb 23 09:09:30 crc kubenswrapper[5047]: I0223 09:09:30.352145 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b87240-45ef-4200-ad38-bbb19b8bcc35" path="/var/lib/kubelet/pods/e9b87240-45ef-4200-ad38-bbb19b8bcc35/volumes"
Feb 23 09:09:31 crc kubenswrapper[5047]: I0223 09:09:31.076505 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:31 crc kubenswrapper[5047]: I0223 09:09:31.076863 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:31 crc kubenswrapper[5047]: I0223 09:09:31.135245 5047 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:31 crc kubenswrapper[5047]: I0223 09:09:31.680362 5047 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:35 crc kubenswrapper[5047]: I0223 09:09:35.959380 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2r8x"]
Feb 23 09:09:35 crc kubenswrapper[5047]: I0223 09:09:35.959930 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k2r8x" podUID="033e9e29-5654-4278-abe1-192153a996a2" containerName="registry-server" containerID="cri-o://7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169" gracePeriod=2
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.362615 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.456255 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-catalog-content\") pod \"033e9e29-5654-4278-abe1-192153a996a2\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") "
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.456417 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-utilities\") pod \"033e9e29-5654-4278-abe1-192153a996a2\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") "
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.457585 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-utilities" (OuterVolumeSpecName: "utilities") pod "033e9e29-5654-4278-abe1-192153a996a2" (UID: "033e9e29-5654-4278-abe1-192153a996a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.457680 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7lwb\" (UniqueName: \"kubernetes.io/projected/033e9e29-5654-4278-abe1-192153a996a2-kube-api-access-k7lwb\") pod \"033e9e29-5654-4278-abe1-192153a996a2\" (UID: \"033e9e29-5654-4278-abe1-192153a996a2\") "
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.459333 5047 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.462617 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033e9e29-5654-4278-abe1-192153a996a2-kube-api-access-k7lwb" (OuterVolumeSpecName: "kube-api-access-k7lwb") pod "033e9e29-5654-4278-abe1-192153a996a2" (UID: "033e9e29-5654-4278-abe1-192153a996a2"). InnerVolumeSpecName "kube-api-access-k7lwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.506375 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "033e9e29-5654-4278-abe1-192153a996a2" (UID: "033e9e29-5654-4278-abe1-192153a996a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.561128 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7lwb\" (UniqueName: \"kubernetes.io/projected/033e9e29-5654-4278-abe1-192153a996a2-kube-api-access-k7lwb\") on node \"crc\" DevicePath \"\""
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.561162 5047 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033e9e29-5654-4278-abe1-192153a996a2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.640163 5047 generic.go:334] "Generic (PLEG): container finished" podID="033e9e29-5654-4278-abe1-192153a996a2" containerID="7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169" exitCode=0
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.640211 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2r8x" event={"ID":"033e9e29-5654-4278-abe1-192153a996a2","Type":"ContainerDied","Data":"7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169"}
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.640241 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2r8x" event={"ID":"033e9e29-5654-4278-abe1-192153a996a2","Type":"ContainerDied","Data":"dad717b76ec2b6d53c459628dcf450aecee9f7f19ffc5274327c0db9683892e3"}
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.640263 5047 scope.go:117] "RemoveContainer" containerID="7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.640400 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2r8x"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.679324 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2r8x"]
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.680825 5047 scope.go:117] "RemoveContainer" containerID="4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.688260 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k2r8x"]
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.700261 5047 scope.go:117] "RemoveContainer" containerID="b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.725792 5047 scope.go:117] "RemoveContainer" containerID="7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169"
Feb 23 09:09:36 crc kubenswrapper[5047]: E0223 09:09:36.726277 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169\": container with ID starting with 7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169 not found: ID does not exist" containerID="7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.726342 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169"} err="failed to get container status \"7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169\": rpc error: code = NotFound desc = could not find container \"7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169\": container with ID starting with 7a657b901901ee0de100e19562a4d1b0a7ba7574ac76ae125c0212c7d60f0169 not found: ID does not exist"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.726388 5047 scope.go:117] "RemoveContainer" containerID="4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea"
Feb 23 09:09:36 crc kubenswrapper[5047]: E0223 09:09:36.727029 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea\": container with ID starting with 4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea not found: ID does not exist" containerID="4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.727084 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea"} err="failed to get container status \"4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea\": rpc error: code = NotFound desc = could not find container \"4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea\": container with ID starting with 4df74faeca902ebc03b67b46fc27491949afb0f95684109f17add609cd6d0dea not found: ID does not exist"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.727117 5047 scope.go:117] "RemoveContainer" containerID="b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f"
Feb 23 09:09:36 crc kubenswrapper[5047]: E0223 09:09:36.727466 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f\": container with ID starting with b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f not found: ID does not exist" containerID="b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f"
Feb 23 09:09:36 crc kubenswrapper[5047]: I0223 09:09:36.727515 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f"} err="failed to get container status \"b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f\": rpc error: code = NotFound desc = could not find container \"b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f\": container with ID starting with b2030569ea3493693e6873095b756dddfbb210b76cd57619b348007370f0125f not found: ID does not exist"
Feb 23 09:09:38 crc kubenswrapper[5047]: I0223 09:09:38.353878 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033e9e29-5654-4278-abe1-192153a996a2" path="/var/lib/kubelet/pods/033e9e29-5654-4278-abe1-192153a996a2/volumes"
Feb 23 09:09:46 crc kubenswrapper[5047]: I0223 09:09:46.760517 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:09:46 crc kubenswrapper[5047]: I0223 09:09:46.761444 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:10:06 crc kubenswrapper[5047]: I0223 09:10:06.860358 5047 generic.go:334] "Generic (PLEG): container finished" podID="20142b23-ff61-429a-bcd2-459f698dad1d" containerID="b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452" exitCode=0
Feb 23 09:10:06 crc kubenswrapper[5047]: I0223 09:10:06.860429 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lqvl/must-gather-qksss" event={"ID":"20142b23-ff61-429a-bcd2-459f698dad1d","Type":"ContainerDied","Data":"b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452"}
Feb 23 09:10:06 crc kubenswrapper[5047]: I0223 09:10:06.861768 5047 scope.go:117] "RemoveContainer" containerID="b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452"
Feb 23 09:10:07 crc kubenswrapper[5047]: I0223 09:10:07.642778 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7lqvl_must-gather-qksss_20142b23-ff61-429a-bcd2-459f698dad1d/gather/0.log"
Feb 23 09:10:15 crc kubenswrapper[5047]: I0223 09:10:15.827869 5047 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7lqvl/must-gather-qksss"]
Feb 23 09:10:15 crc kubenswrapper[5047]: I0223 09:10:15.828868 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7lqvl/must-gather-qksss" podUID="20142b23-ff61-429a-bcd2-459f698dad1d" containerName="copy" containerID="cri-o://2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4" gracePeriod=2
Feb 23 09:10:15 crc kubenswrapper[5047]: I0223 09:10:15.834732 5047 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7lqvl/must-gather-qksss"]
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.263770 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7lqvl_must-gather-qksss_20142b23-ff61-429a-bcd2-459f698dad1d/copy/0.log"
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.264785 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lqvl/must-gather-qksss"
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.379568 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6f4j\" (UniqueName: \"kubernetes.io/projected/20142b23-ff61-429a-bcd2-459f698dad1d-kube-api-access-b6f4j\") pod \"20142b23-ff61-429a-bcd2-459f698dad1d\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") "
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.379629 5047 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20142b23-ff61-429a-bcd2-459f698dad1d-must-gather-output\") pod \"20142b23-ff61-429a-bcd2-459f698dad1d\" (UID: \"20142b23-ff61-429a-bcd2-459f698dad1d\") "
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.384891 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20142b23-ff61-429a-bcd2-459f698dad1d-kube-api-access-b6f4j" (OuterVolumeSpecName: "kube-api-access-b6f4j") pod "20142b23-ff61-429a-bcd2-459f698dad1d" (UID: "20142b23-ff61-429a-bcd2-459f698dad1d"). InnerVolumeSpecName "kube-api-access-b6f4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.481017 5047 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6f4j\" (UniqueName: \"kubernetes.io/projected/20142b23-ff61-429a-bcd2-459f698dad1d-kube-api-access-b6f4j\") on node \"crc\" DevicePath \"\""
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.502254 5047 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20142b23-ff61-429a-bcd2-459f698dad1d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "20142b23-ff61-429a-bcd2-459f698dad1d" (UID: "20142b23-ff61-429a-bcd2-459f698dad1d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.582406 5047 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20142b23-ff61-429a-bcd2-459f698dad1d-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.759340 5047 patch_prober.go:28] interesting pod/machine-config-daemon-wh6hv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.759395 5047 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.759436 5047 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv"
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.760025 5047 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c1d3f51ba6a1b765001fc2b3608a0b8587309852ab518a873052699c9dfa680"} pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.760087 5047 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" podUID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerName="machine-config-daemon" containerID="cri-o://1c1d3f51ba6a1b765001fc2b3608a0b8587309852ab518a873052699c9dfa680" gracePeriod=600
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.974648 5047 generic.go:334] "Generic (PLEG): container finished" podID="ca275411-978b-439b-ab4b-f98a7ac42f8b" containerID="1c1d3f51ba6a1b765001fc2b3608a0b8587309852ab518a873052699c9dfa680" exitCode=0
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.974722 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerDied","Data":"1c1d3f51ba6a1b765001fc2b3608a0b8587309852ab518a873052699c9dfa680"}
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.974775 5047 scope.go:117] "RemoveContainer" containerID="6f10bc769ffed5057ed3df83e360ec647faf98a15dadef19b455221dd8fcb492"
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.976384 5047 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7lqvl_must-gather-qksss_20142b23-ff61-429a-bcd2-459f698dad1d/copy/0.log"
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.977668 5047 generic.go:334] "Generic (PLEG): container finished" podID="20142b23-ff61-429a-bcd2-459f698dad1d" containerID="2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4" exitCode=143
Feb 23 09:10:16 crc kubenswrapper[5047]: I0223 09:10:16.977738 5047 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lqvl/must-gather-qksss"
Feb 23 09:10:17 crc kubenswrapper[5047]: I0223 09:10:17.046063 5047 scope.go:117] "RemoveContainer" containerID="2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4"
Feb 23 09:10:17 crc kubenswrapper[5047]: I0223 09:10:17.074859 5047 scope.go:117] "RemoveContainer" containerID="b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452"
Feb 23 09:10:17 crc kubenswrapper[5047]: I0223 09:10:17.152077 5047 scope.go:117] "RemoveContainer" containerID="2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4"
Feb 23 09:10:17 crc kubenswrapper[5047]: E0223 09:10:17.152552 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4\": container with ID starting with 2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4 not found: ID does not exist" containerID="2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4"
Feb 23 09:10:17 crc kubenswrapper[5047]: I0223 09:10:17.152584 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4"} err="failed to get container status \"2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4\": rpc error: code = NotFound desc = could not find container \"2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4\": container with ID starting with 2d7f151012182078ca4d0c92fd0168143a667a39e426391af2dd552ca3b896b4 not found: ID does not exist"
Feb 23 09:10:17 crc kubenswrapper[5047]: I0223 09:10:17.152604 5047 scope.go:117] "RemoveContainer" containerID="b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452"
Feb 23 09:10:17 crc kubenswrapper[5047]: E0223 09:10:17.153071 5047 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452\": container with ID starting with b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452 not found: ID does not exist" containerID="b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452" Feb 23 09:10:17 crc kubenswrapper[5047]: I0223 09:10:17.153131 5047 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452"} err="failed to get container status \"b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452\": rpc error: code = NotFound desc = could not find container \"b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452\": container with ID starting with b3ac641dea4dd757f9cd36f96424dd8b7b5e05eb94b0c50889d7eee200cec452 not found: ID does not exist" Feb 23 09:10:17 crc kubenswrapper[5047]: I0223 09:10:17.992958 5047 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wh6hv" event={"ID":"ca275411-978b-439b-ab4b-f98a7ac42f8b","Type":"ContainerStarted","Data":"998229ab2edbea41dadeb3e561e530a193711a070b4bfe470bd2007383b5ebe7"} Feb 23 09:10:18 crc kubenswrapper[5047]: I0223 09:10:18.359545 5047 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20142b23-ff61-429a-bcd2-459f698dad1d" path="/var/lib/kubelet/pods/20142b23-ff61-429a-bcd2-459f698dad1d/volumes"